app

Gradio app.py
#| eval: false
from dotenv import load_dotenv

load_dotenv()
True
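For load_dotenv() to find credentials, the project root needs a .env file. A minimal sketch with placeholder values (only OPENAI_API_KEY is confirmed by the docs below; the Edamam variable names are an assumption for the recipe search tool):

# .env (placeholder values)
OPENAI_API_KEY=...
# assumed names for the vegan_recipe_edamam_search credentials
EDAMAM_APP_ID=...
EDAMAM_APP_KEY=...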

Put the chat backend pieces together


ConversationBufferMemory

 ConversationBufferMemory (chat_memory:langchain.schema.memory.BaseChatMessageHistory=None,
                           output_key:Optional[str]=None,
                           input_key:Optional[str]=None,
                           return_messages:bool=False,
                           human_prefix:str='Human', ai_prefix:str='AI',
                           memory_key:str='history')

Buffer for storing conversation memory.
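A quick, standalone sketch of how the buffer behaves (standard LangChain usage, separate from the app wiring below):

from langchain.memory import ConversationBufferMemory

memory = ConversationBufferMemory(return_messages=True, memory_key="chat_history")
# each turn is stored as a Human/AI message pair
memory.save_context({"input": "Hi"}, {"output": "What ingredients do you wish to cook with?"})
# the agent reads the buffer back under the configured memory_key
memory.load_memory_variables({})
# -> {'chat_history': [HumanMessage(...), AIMessage(...)]}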


ChatMessageHistory

 ChatMessageHistory (messages:List[langchain.schema.messages.BaseMessage]=[])

In memory implementation of chat message history.

Stores messages in an in memory list.
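The app seeds this history below with the formatted INIT_PROMPT messages; the same can be done one message at a time:

from langchain.memory import ChatMessageHistory

history = ChatMessageHistory()
history.add_user_message("Ingredients: tofu, carrots, bread")
history.add_ai_message("Do you have any allergies I should be aware of?")
history.messages  # plain list of BaseMessage objects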


ChatOpenAI

 ChatOpenAI (cache:Optional[bool]=None, verbose:bool=None,
             callbacks:Union[List[langchain.callbacks.base.BaseCallbackHandler],
                             langchain.callbacks.base.BaseCallbackManager,NoneType]=None,
             callback_manager:Optional[langchain.callbacks.base.BaseCallbackManager]=None,
             tags:Optional[List[str]]=None,
             metadata:Optional[Dict[str,Any]]=None, client:Any=None,
             model:str='gpt-3.5-turbo', temperature:float=0.7,
             model_kwargs:Dict[str,Any]=None,
             openai_api_key:Optional[str]=None,
             openai_api_base:Optional[str]=None,
             openai_organization:Optional[str]=None,
             openai_proxy:Optional[str]=None,
             request_timeout:Union[float,Tuple[float,float],NoneType]=None,
             max_retries:int=6, streaming:bool=False, n:int=1,
             max_tokens:Optional[int]=None,
             tiktoken_model_name:Optional[str]=None)

Wrapper around OpenAI Chat large language models.

To use, you should have the openai python package installed, and the environment variable OPENAI_API_KEY set with your API key.

Any parameters that are valid to be passed to the openai.create call can be passed in, even if not explicitly saved on this class.

Example:

    from langchain.chat_models import ChatOpenAI
    openai = ChatOpenAI(model_name="gpt-3.5-turbo")
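As an illustration of the pass-through note above (the parameter values here are arbitrary), extra OpenAI arguments that are not explicit fields can go through model_kwargs:

from langchain.chat_models import ChatOpenAI

chat = ChatOpenAI(
    model="gpt-3.5-turbo",
    temperature=0.7,
    # forwarded to the underlying openai chat completion call
    model_kwargs={"top_p": 0.9, "presence_penalty": 0.2},
)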
llm = ChatOpenAI(temperature=1)
MEMORY_KEY = "chat_history"
chat_msgs = INIT_PROMPT.format_prompt(
    ingredients="tofu, pickles, mustard, olives, tomatoes, lettuce, bell peppers, carrots, bread",
    allergies="",
    recipe_freeform_input="The preparation time should be less than 30 minutes. I really love Thai food!",
)
chat_msgs = chat_msgs.to_messages()
results = llm.generate([chat_msgs])

chat_msgs.append(results.generations[0][0].message)
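results is an LLMResult: generations[0][0] is the first candidate for the first prompt, and its .message is the AIMessage appended above, e.g.:

ai_reply = results.generations[0][0].message
print(ai_reply.content)  # e.g. "Vegan, quick, Thai tofu, bell peppers"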
tools = [vegan_recipe_edamam_search]
prompt = OpenAIFunctionsAgent.create_prompt(
    system_message=INIT_PROMPT.messages[0],
    extra_prompt_messages=chat_msgs + [MessagesPlaceholder(variable_name=MEMORY_KEY)],
)
memory = ConversationBufferMemory(
    chat_memory=ChatMessageHistory(messages=chat_msgs),
    return_messages=True,
    memory_key=MEMORY_KEY,
)
agent_executor = AgentExecutor(
    agent=OpenAIFunctionsAgent(llm=llm, tools=tools, prompt=prompt),
    tools=tools,
    memory=memory,
    verbose=True,
)
memory.chat_memory.messages
[SystemMessage(content='The following is a conversation between a human and a friendly vegan AI. \nThe AI is compassionate to animals and therefore serves to promote veganism.\nKnowledge: A vegan diet implies a plant-based diet avoiding all animal foods such as meat (including fish, shellfish and insects), dairy, eggs and honey.\nIf the human messages are not aligned with veganism, remind them of your purpose.\nThe AI never generates vegan recipes itself but instead uses a tool.', additional_kwargs={}),
 AIMessage(content='What ingredients do you wish to cook with?', additional_kwargs={}, example=False),
 HumanMessage(content='Ingredients: tofu, pickles, mustard, olives, tomatoes, lettuce, bell peppers, carrots, bread', additional_kwargs={}, example=False),
 AIMessage(content='Do you have any allergies I should be aware of?', additional_kwargs={}, example=False),
 HumanMessage(content='Allergies: ', additional_kwargs={}, example=False),
 AIMessage(content='Do you have any preferences I should consider for the recipe such as preparation time, difficulty, or cuisine region?', additional_kwargs={}, example=False),
 HumanMessage(content="Preferences: `The preparation time shVegan spaghetti aglio e olio ould be less than 30 minutes. I really love Thai food!`\nYour task is compose a concise, 6 word max vegan recipe keyword query to use in an API search.\nThink step by step.\n\n1. If the user listed any ingredients, choose the three ingredients that are most commonly used together in recipes that fall within the user's preferences (if any are included). \n2. If the user provided any allergies, include them in the query.\nFormat your response as message with the allergy and diet preferences first and then the ingredients.\nExamples:\n'Vegan gluten-free chicken peppers' or 'Vegan tofu, brocolli, and miso'", additional_kwargs={}, example=False),
 AIMessage(content='Vegan, quick, Thai tofu, bell peppers', additional_kwargs={}, example=False)]
agent_executor.run("Search for vegan recipe")


> Entering new AgentExecutor chain...

Invoking: `vegan_recipe_edamam_search` with `{'query': 'Tofu pickle sandwich with Thai-inspired flavors'}`


[]
I apologize, but I couldn't find any vegan recipes matching your query. Can I help you with anything else?

> Finished chain.
"I apologize, but I couldn't find any vegan recipes matching your query. Can I help you with anything else?"
agent_executor.run("Which ingredients that I provided go the best together in dishes?")

source

ConversationBot

 ConversationBot (verbose=True)

Initialize self. See help(type(self)) for accurate signature.
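The class body is not reproduced on this page. As a rough sketch of the shape implied by the calls and outputs below (the class and method names in this sketch are hypothetical, not the real implementation in lv_recipe_chatbot/app.py), a bot of this kind wraps the memory, the agent executor, and an image-to-ingredients step:

# hypothetical sketch only, reusing llm, tools, prompt, chat_msgs, MEMORY_KEY from above
class ConversationBotSketch:
    def __init__(self, verbose=True):
        self.memory = ConversationBufferMemory(
            chat_memory=ChatMessageHistory(messages=chat_msgs),
            return_messages=True,
            memory_key=MEMORY_KEY,
        )
        self.agent_executor = AgentExecutor(
            agent=OpenAIFunctionsAgent(llm=llm, tools=tools, prompt=prompt),
            tools=tools,
            memory=self.memory,
            verbose=verbose,
        )

    def respond(self, message: str) -> str:
        # an uploaded image would first be turned into a description plus an
        # ingredient list (as in the output below) and appended to the message
        return self.agent_executor.run(message)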

os.listdir(SAMPLE_IMG_DIR)
SAMPLE_IMG_DIR
Path('/home/evylz/AnimalEquality/lv-recipe-chatbot/assets/images/vegan_ingredients')
CPU times: user 6.19 s, sys: 1.47 s, total: 7.66 s
Wall time: 4.68 s
I uploaded an image that may contain vegan ingredients.
The description of the image is: `a refrigerator with food inside`.
The extracted ingredients are:
```
cabbage lettuce onion
apples
rice
plant-based milk
```

CPU times: user 56.7 s, sys: 63.6 ms, total: 56.8 s
Wall time: 5.95 s

source

create_demo

 create_demo (bot=<class '__main__.ConversationBot'>)
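create_demo's body is not shown here; a minimal Gradio sketch of the same idea (the layout and the bot.respond call are assumptions carried over from the hypothetical sketch above):

import gradio as gr

def create_demo_sketch(bot):
    with gr.Blocks() as demo:
        chatbot = gr.Chatbot()
        msg = gr.Textbox(placeholder="What ingredients do you wish to cook with?")

        def respond(message, chat_history):
            reply = bot.respond(message)  # hypothetical method name
            chat_history.append((message, reply))
            return "", chat_history

        msg.submit(respond, [msg, chatbot], [msg, chatbot])
    return demo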
if "demo" in globals():
    demo.close()
demo = create_demo(bot)
demo.launch()
Closing server running on port: 7860
Running on local URL:  http://127.0.0.1:7860

To create a public link, set `share=True` in `launch()`.