---
title: 'Streamlit'
description: 'Integrate with Streamlit to plug and play with any LLM'
---
In this example, we will learn how to use `mistralai/Mixtral-8x7B-Instruct-v0.1` and Embedchain together with Streamlit to build a simple RAG chatbot.

## Setup
Install Embedchain and Streamlit.
```bash
pip install embedchain streamlit
```
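You will also need a Hugging Face access token; the app asks for it in its sidebar. If you want to confirm the token is valid before launching the app, an optional check like the sketch below works, assuming `huggingface_hub` is installed in your environment (the `"hf_..."` string is a placeholder for your own token):
```python
# Optional sanity check: confirm the Hugging Face token is valid.
# Assumes huggingface_hub is installed; "hf_..." is a placeholder for your token.
from huggingface_hub import whoami

print(whoami(token="hf_..."))  # raises an error if the token is invalid
```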
<Tabs>
<Tab title="app.py">
```python
import os

import streamlit as st
from embedchain import App

# Sidebar: collect the Hugging Face access token used by the LLM and embedder.
with st.sidebar:
    huggingface_access_token = st.text_input("Hugging Face Token", key="chatbot_api_key", type="password")
    "[Get Hugging Face Access Token](https://huggingface.co/settings/tokens)"
    "[View the source code](https://github.com/embedchain/examples/mistral-streamlit)"

st.title("💬 Chatbot")
st.caption("🚀 An Embedchain app powered by Mistral!")

# Seed the conversation with a welcome message on first load.
if "messages" not in st.session_state:
    st.session_state.messages = [
        {
            "role": "assistant",
            "content": """
        Hi! I'm a chatbot. I can answer questions and learn new things!\n
        Ask me anything and if you want me to learn something do `/add <source>`.\n
        I can learn mostly everything. :)
        """,
        }
    ]

# Replay the chat history on every rerun.
for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.markdown(message["content"])

if prompt := st.chat_input("Ask me anything!"):
    if not st.session_state.chatbot_api_key:
        st.error("Please enter your Hugging Face Access Token")
        st.stop()

    os.environ["HUGGINGFACE_ACCESS_TOKEN"] = st.session_state.chatbot_api_key
    app = App.from_config(config_path="config.yaml")

    # `/add <source>` adds a new source to the knowledge base instead of chatting.
    if prompt.startswith("/add"):
        with st.chat_message("user"):
            st.markdown(prompt)
            st.session_state.messages.append({"role": "user", "content": prompt})
        prompt = prompt.replace("/add", "").strip()
        with st.chat_message("assistant"):
            message_placeholder = st.empty()
            message_placeholder.markdown("Adding to knowledge base...")
            app.add(prompt)
            message_placeholder.markdown(f"Added {prompt} to knowledge base!")
            st.session_state.messages.append({"role": "assistant", "content": f"Added {prompt} to knowledge base!"})
        st.stop()

    with st.chat_message("user"):
        st.markdown(prompt)
        st.session_state.messages.append({"role": "user", "content": prompt})

    # Stream the assistant's answer chunk by chunk.
    with st.chat_message("assistant"):
        msg_placeholder = st.empty()
        msg_placeholder.markdown("Thinking...")
        full_response = ""

        for response in app.chat(prompt):
            msg_placeholder.empty()
            full_response += response

        msg_placeholder.markdown(full_response)
        st.session_state.messages.append({"role": "assistant", "content": full_response})
```
</Tab>
<Tab title="config.yaml">
```yaml
app:
  config:
    name: 'mistral-streamlit-app'

llm:
  provider: huggingface
  config:
    model: 'mistralai/Mixtral-8x7B-Instruct-v0.1'
    temperature: 0.1
    max_tokens: 250
    top_p: 0.1
    stream: true

embedder:
  provider: huggingface
  config:
    model: 'sentence-transformers/all-mpnet-base-v2'
```
</Tab>
</Tabs>
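The Streamlit UI is a thin wrapper around two Embedchain calls: `app.add(...)` to index a source and `app.chat(...)` to answer from it, streamed because `stream: true` is set in `config.yaml`. If you want to sanity-check the config without the UI, a minimal sketch like the following should work, assuming `HUGGINGFACE_ACCESS_TOKEN` is already set and treating the URL and question as illustrative examples:
```python
import os

from embedchain import App

# Assumes HUGGINGFACE_ACCESS_TOKEN is already exported in your shell.
assert os.environ.get("HUGGINGFACE_ACCESS_TOKEN"), "Set HUGGINGFACE_ACCESS_TOKEN first"

app = App.from_config(config_path="config.yaml")

# Example source; replace with any URL or file you want the bot to learn from.
app.add("https://en.wikipedia.org/wiki/Mistral_AI")

# With stream: true in config.yaml, chat() yields chunks instead of one string.
answer = ""
for chunk in app.chat("What does Mistral AI build?"):
    answer += chunk
print(answer)
```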
## Run it locally
```bash
streamlit run app.py
```
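Once the app is running, paste your Hugging Face access token in the sidebar and chat normally. To teach the bot something new, send `/add <source>` (for example, a URL); after it has been added to the knowledge base, you can ask questions about that content.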