Tuana committed
Commit 171d07b
Parent: 459807c

port to 2.0

Files changed (5):
  1. README.md +5 -5
  2. app.py +1 -1
  3. requirements.txt +2 -3
  4. utils/haystack.py +40 -30
  5. utils/ui.py +2 -2
README.md CHANGED
@@ -17,22 +17,22 @@ pinned: false
 ##### A simple app to get an overview of what the Mastodon user has been posting about and their tone
 
 This is a demo just for fun 🥳
-This repo contains a streamlit application that given a Mastodon username, tells you what type of things they've been posting about lately, their tone, and the languages they use. It uses the LLM by OpenAI `text-davinci-003`.
+This repo contains a streamlit application that, given a Mastodon username, tells you what they've been posting about lately, their tone, and the languages they use. It uses OpenAI's `gpt-4` LLM.
 
-It's been built with [Haystack](https://haystack.deepset.ai) using the [`PromptNode`](https://docs.haystack.deepset.ai/docs/prompt_node) and by creating a custom [`PromptTemplate`](https://docs.haystack.deepset.ai/docs/prompt_node#templates)
+It's been built with [Haystack](https://haystack.deepset.ai) using the [`OpenAIGenerator`](https://docs.haystack.deepset.ai/v2.0/docs/openaigenerator) and a custom [`PromptBuilder`](https://docs.haystack.deepset.ai/v2.0/docs/promptbuilder).
 
 https://user-images.githubusercontent.com/15802862/220464834-f42c038d-54b4-4d5e-8d59-30d95143b616.mov
 
 
 ### Points of improvement
 
-Since we're using a generative model here, we need to be a bit creative with the prompt we provide it to minimize any hallucination or similar unwanted results. For this reason, I've tried to be a bit creative with the `PromptTemplate` and give some examples of _how_ to construct a summary. However, this still sometimes produces odd results.
+Since we're using a generative model here, we need to be a bit creative with the prompt we provide to minimize hallucinations and similar unwanted results. For this reason, the `PromptBuilder` template gives some examples of _how_ to construct a summary. However, this still sometimes produces odd results.
 
 If you try to run it yourself and find ways to make this app better, please feel free to create an issue/PR 🙌
 
-## To learn more about the PromptNode
+## To learn more about the PromptBuilder
 
-Check out our tutorial on the PromptNode and how to create your own templates [here](https://haystack.deepset.ai/tutorials/21_customizing_promptnode)
+As of Haystack 2.0-Beta, you can create prompt templates with Jinja. Check out the guide on creating prompts [here](https://docs.haystack.deepset.ai/v2.0/docs/promptbuilder).
 
 ## Installation and Running
 To run the bare application which does _nothing_:
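The new README points to Jinja templating via `PromptBuilder`. A minimal sketch of the substitution it performs, using plain `jinja2` (the templating engine Haystack 2.0 builds on) rather than Haystack itself; the `documents` variable mirrors the one in `utils/haystack.py`, and the post text is a made-up example:

```python
from jinja2 import Template

# The same {{ documents }} placeholder used in the app's prompt template.
template = Template("Post stream: {{ documents }}\n\nSummary:")

# Render with a stand-in post stream (illustrative content only).
prompt = template.render(documents="[@deepset_ai]: So many updates today!")
print(prompt)
```

`PromptBuilder` does essentially this rendering step and emits the result on its `prompt` output socket.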
app.py CHANGED
@@ -54,5 +54,5 @@ if st.session_state.get("api_key_configured"):
 
     if st.session_state.result:
         voice = st.session_state.result
-        st.write(voice['results'][0])
+        st.write(voice[0])
 
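The one-line change above reflects the new result shape: the 1.x `PromptNode` pipeline returned a dict with a `results` key, while the ported `query()` helper in `utils/haystack.py` already extracts the generator's replies and returns a plain list. A rough sketch of the two access patterns (the result strings are hypothetical):

```python
# Haystack 1.x: pipeline.run() returned a dict keyed by "results".
old_result = {"results": ["This user posts about NLP."]}

# After the port: query() returns the replies as a plain list.
new_result = ["This user posts about NLP."]

# Both access patterns reach the same first answer.
assert old_result["results"][0] == new_result[0]
```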
requirements.txt CHANGED
@@ -1,7 +1,6 @@
-safetensors==0.3.3.post1
-farm-haystack==1.20.0
+haystack-ai==2.0.0b4
 streamlit==1.21.0
 markdown
 st-annotated-text
 python-dotenv
-mastodon-fetcher-haystack==0.0.1
+mastodon-fetcher-haystack
utils/haystack.py CHANGED
@@ -1,51 +1,61 @@
 import streamlit as st
 from mastodon_fetcher_haystack.mastodon_fetcher import MastodonFetcher
 from haystack import Pipeline
-from haystack.nodes import PromptNode, PromptTemplate
+from haystack.components.generators import OpenAIGenerator
+from haystack.components.builders import PromptBuilder
 
 def start_haystack(openai_key):
     # Use this function to construct a pipeline
     fetcher = MastodonFetcher()
 
-    mastodon_template = PromptTemplate(prompt="""You will be given a post stream belonging to a specific Mastodon profile. Answer with a summary of what they've lately been posting about and in what languages.
-    You may go into some detail about what topics they tend to like postint about. Please also mention their overall tone, for example: positive,
-    negative, political, sarcastic or something else.
-
-    Examples:
-
-    Post stream: [@deepset_ai](https://mastodon.social/@deepset_ai): Come join our Haystack server for our first Discord event tomorrow, a deepset AMA session with @rusic_milos @malte_pietsch…
-    [@deepset_ai](https://mastodon.social/@deepset_ai): Join us for a chat! On Thursday 25th we are hosting a 'deepset - Ask Me Anything' session on our brand new Discord. Come…
-    [@deepset_ai](https://mastodon.social/@deepset_ai): Curious about how you can use @OpenAI GPT3 in a Haystack pipeline? This week we released Haystack 1.7 with which we introdu…
-    [@deepset_ai](https://mastodon.social/@deepset_ai): So many updates from @deepset_ai today!
-
-    Summary: This user has lately been reposting posts from @deepset_ai. The topics of the posts have been around the Haystack community, NLP and GPT. They've
-    been posting in English, and have had a positive, informative tone.
-
-    Post stream: I've directed my team to set sharper rules on how we deal with unidentified objects.\n\nWe will inventory, improve ca…
-    the incursion by China’s high-altitude balloon, we enhanced radar to pick up slower objects.\n \nBy doing so, w…
-    I gave an update on the United States’ response to recent aerial objects.
-
-    Summary: This user has lately been posting about having sharper rules to deal with unidentified objects and an incursuin by China's high-altitude
-    baloon. Their pots have mostly been neutral but determined in tone. They mostly post in English.
-
-    Post stream: {join(documents)}
-
-    Summary:
-    """)
-    prompt_node = PromptNode(model_name_or_path="gpt-4", default_prompt_template=mastodon_template, api_key=openai_key)
+    mastodon_template = """You will be given a post stream belonging to a specific Mastodon profile. Answer with a summary of what they've lately been posting about and in what languages.
+    You may go into some detail about what topics they tend to like posting about. Please also mention their overall tone, for example: positive,
+    negative, political, sarcastic or something else.
+
+    Examples:
+
+    Post stream: [@deepset_ai](https://mastodon.social/@deepset_ai): Come join our Haystack server for our first Discord event tomorrow, a deepset AMA session with @rusic_milos @malte_pietsch…
+    [@deepset_ai](https://mastodon.social/@deepset_ai): Join us for a chat! On Thursday 25th we are hosting a 'deepset - Ask Me Anything' session on our brand new Discord. Come…
+    [@deepset_ai](https://mastodon.social/@deepset_ai): Curious about how you can use @OpenAI GPT3 in a Haystack pipeline? This week we released Haystack 1.7 with which we introdu…
+    [@deepset_ai](https://mastodon.social/@deepset_ai): So many updates from @deepset_ai today!
+
+    Summary: This user has lately been reposting posts from @deepset_ai. The topics of the posts have been around the Haystack community, NLP and GPT. They've
+    been posting in English, and have had a positive, informative tone.
+
+    Post stream: I've directed my team to set sharper rules on how we deal with unidentified objects.\n\nWe will inventory, improve ca…
+    the incursion by China’s high-altitude balloon, we enhanced radar to pick up slower objects.\n \nBy doing so, w…
+    I gave an update on the United States’ response to recent aerial objects.
+
+    Summary: This user has lately been posting about having sharper rules to deal with unidentified objects and an incursion by China's high-altitude
+    balloon. Their posts have mostly been neutral but determined in tone. They mostly post in English.
+
+    Post stream: {{ documents }}
+
+    Summary:
+    """
+    prompt_builder = PromptBuilder(template=mastodon_template)
+    llm = OpenAIGenerator(model_name="gpt-4", api_key=openai_key)
 
     st.session_state["haystack_started"] = True
 
     mastodon_pipeline = Pipeline()
-    mastodon_pipeline.add_node(component=fetcher, name="MastodonFetcher", inputs=["Query"])
-    mastodon_pipeline.add_node(component=prompt_node, name="PromptNode", inputs=["MastodonFetcher"])
+    mastodon_pipeline.add_component("fetcher", fetcher)
+    mastodon_pipeline.add_component("prompt_builder", prompt_builder)
+    mastodon_pipeline.add_component("llm", llm)
+
+    mastodon_pipeline.connect("fetcher.documents", "prompt_builder.documents")
+    mastodon_pipeline.connect("prompt_builder.prompt", "llm.prompt")
+
     return mastodon_pipeline
 
 
 @st.cache_data(show_spinner=True)
 def query(username, _pipeline):
     try:
-        result = _pipeline.run(query=username, params={"MastodonFetcher": {"last_k_posts": 20}})
+        replies = _pipeline.run(data={"fetcher": {"username": username, "last_k_posts": 20}})
+        result = replies['llm']['replies']
     except Exception as e:
         result = ["Please make sure you are providing a correct, public Mastodon account"]
     return result
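The `query()` change above depends on the 2.0 `Pipeline.run()` contract: inputs are routed to components by name, and outputs come back in a dict keyed by component name. A dependency-free sketch of just those dict shapes (this is NOT the real Haystack `Pipeline`; the stand-in `run()` below only mimics the shapes the commit relies on, and the username is a made-up example):

```python
# Stand-in for Pipeline.run(), mimicking only the input/output dict shapes.
def run(data):
    # Inputs are routed by component name ("fetcher", as named in the diff).
    username = data["fetcher"]["username"]
    last_k = data["fetcher"]["last_k_posts"]
    # ...fetcher -> prompt_builder -> llm would execute here...
    summary = f"Summary of the last {last_k} posts by {username}."
    # Outputs are keyed by component name; OpenAIGenerator exposes "replies".
    return {"llm": {"replies": [summary]}}

replies = run(data={"fetcher": {"username": "someone@mastodon.social", "last_k_posts": 20}})
result = replies["llm"]["replies"]
print(result[0])
```

This is why `query()` indexes `replies['llm']['replies']` and why `app.py` can now simply write `voice[0]`.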
utils/ui.py CHANGED
@@ -46,8 +46,8 @@ def sidebar():
     st.markdown("---")
     st.markdown(
         "## How this works\n"
-        "This app was built with [Haystack](https://haystack.deepset.ai) using the"
-        " [`PromptNode`](https://docs.haystack.deepset.ai/docs/prompt_node) and custom [`PromptTemplate`](https://docs.haystack.deepset.ai/docs/prompt_node#templates).\n\n"
+        "This app was built with [Haystack 2.0-Beta](https://haystack.deepset.ai) using the"
+        " [`OpenAIGenerator`](https://docs.haystack.deepset.ai/v2.0/docs/openaigenerator) and [`PromptBuilder`](https://docs.haystack.deepset.ai/v2.0/docs/promptbuilder).\n\n"
         " The source code is also on [GitHub](https://github.com/TuanaCelik/should-i-follow)"
         " with instructions to run locally.\n"
         "You can see how the `PromptNode` was set up [here](https://github.com/TuanaCelik/should-i-follow/blob/main/utils/haystack.py)")