---
title: Chatbot example
emoji: 💬
colorFrom: yellow
colorTo: purple
sdk: gradio
sdk_version: 5.0.1
app_file: app.py
pinned: false
short_description: Chatbot with Hugging Face Spaces & Gradio
---
This is an interactive chatbot deployed on Hugging Face Spaces using Gradio. It supports both Cohere and Hugging Face models, allowing users to select between them for generating responses. The app is built with Gradio, `huggingface_hub`, and the Hugging Face Inference API, with optional Cohere API support.
## Features

✅ Supports two AI models:
- Cohere API (`command-r-plus`)
- Hugging Face API (`mistralai/Mistral-7B-Instruct-v0.3`)

✅ Customizable settings:
- System prompt
- Max tokens
- Temperature
- Top-p value

✅ Streaming responses (for Hugging Face models)

✅ Gradio-powered UI for easy interaction
## Installation & Setup

### 1️⃣ Clone the Repository

```bash
git clone https://huggingface.co/spaces/your-space-name
cd your-space-name
```

### 2️⃣ Install Dependencies

Make sure you have Python installed, then run:

```bash
pip install -r requirements.txt
```
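For reference, the `requirements.txt` for this Space would list at least the following packages (a minimal sketch; version pins are omitted here and would be an assumption):

```text
gradio
huggingface_hub
cohere
```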
### 3️⃣ Set API Keys

You need to set up API keys for Hugging Face and Cohere. You can do this via environment variables:

```bash
export HF_API_KEY='your-huggingface-api-key'
export COHERE_API_KEY='your-cohere-api-key'
```
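Inside `app.py`, these variables can be read with `os.getenv`. Since `os.getenv` silently returns `None` when a key is unset, a stricter fail-fast helper can be handy during setup (a hypothetical sketch, not part of the app itself):

```python
import os


def get_api_key(name: str) -> str:
    """Read an API key from the environment, failing fast if it is unset."""
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"Missing environment variable: {name}")
    return value
```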
### 4️⃣ Run the App Locally

```bash
python app.py
```

This will launch the Gradio interface in your browser.
## Deployment on Hugging Face Spaces

- Create a new Space on Hugging Face.
- Choose Gradio as the framework.
- Upload `app.py` and `requirements.txt`.
- Deploy and test the chatbot.
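The upload step can also be scripted with `huggingface_hub`'s `HfApi.upload_file` instead of the web UI. The helper below is a sketch (the function name `deploy_files` and the repo id are placeholders; `api` is expected to be an `HfApi` instance):

```python
from pathlib import Path


def deploy_files(api, repo_id, files):
    """Upload local files into the root of a Hugging Face Space repo.

    `api` is assumed to be a `huggingface_hub.HfApi` instance (or anything
    exposing the same `upload_file` keyword interface).
    """
    for path in files:
        api.upload_file(
            path_or_fileobj=path,
            path_in_repo=Path(path).name,
            repo_id=repo_id,
            repo_type="space",  # target a Space rather than a model repo
        )


# Example: deploy_files(HfApi(), "your-username/your-space-name",
#                       ["app.py", "requirements.txt"])
```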
## Usage

- Enter your message in the chatbox.
- Choose the AI model (Hugging Face or Cohere).
- Adjust chatbot parameters as needed.
- Receive and interact with AI-generated responses.
## Code Overview

### API Clients

```python
import os

import cohere
from huggingface_hub import InferenceClient

# Keys come from the environment variables set earlier
HF_API_KEY = os.getenv('HF_API_KEY')
COHERE_API_KEY = os.getenv('COHERE_API_KEY')

client_hf = InferenceClient(model='mistralai/Mistral-7B-Instruct-v0.3', token=HF_API_KEY)
client_cohere = cohere.Client(COHERE_API_KEY)
```
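With both clients in hand, a small dispatcher can route each request to whichever backend the user selected. The sketch below injects the clients as arguments so it stays testable; `generate` is a hypothetical helper name, and the call shapes follow `cohere.Client.chat` and `InferenceClient.chat_completion`:

```python
def generate(prompt: str, use_cohere: bool, hf_client, cohere_client,
             max_tokens: int = 512) -> str:
    """Return a single completion from the selected backend."""
    if use_cohere:
        # cohere.Client.chat returns a response with a `.text` attribute
        response = cohere_client.chat(message=prompt, max_tokens=max_tokens)
        return response.text
    # InferenceClient.chat_completion follows the OpenAI-style schema
    response = hf_client.chat_completion(
        messages=[{"role": "user", "content": prompt}],
        max_tokens=max_tokens,
    )
    return response.choices[0].message.content
```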
### Chatbot Function

```python
def respond(message: str, history: list, system_message: str, max_tokens: int,
            temperature: float, top_p: float, use_cohere: bool):
    # Build the conversation in OpenAI-style message format
    messages = [{"role": "system", "content": system_message}]
    for val in history:
        messages.append({"role": "user", "content": val[0]})
        messages.append({"role": "assistant", "content": val[1]})
    messages.append({"role": "user", "content": message})
    # ...the messages list is then sent to the selected model
```
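The history handling above is a pure transformation, so it can be factored into a standalone helper and unit-tested in isolation (`build_messages` is a hypothetical name for illustration):

```python
def build_messages(message, history, system_message):
    """Convert Gradio's (user, assistant) history pairs into chat-API messages."""
    messages = [{"role": "system", "content": system_message}]
    for user_turn, assistant_turn in history:
        messages.append({"role": "user", "content": user_turn})
        messages.append({"role": "assistant", "content": assistant_turn})
    messages.append({"role": "user", "content": message})
    return messages
```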
### Gradio UI Setup

```python
import gradio as gr

demo = gr.ChatInterface(
    respond,
    additional_inputs=[
        gr.Textbox(value='You are a friendly Chatbot.', label='System prompt'),
        gr.Slider(minimum=1, maximum=2048, value=512, step=1, label='Max new tokens'),
        gr.Slider(minimum=0.1, maximum=4.0, value=0.7, step=0.1, label='Temperature'),
        gr.Slider(minimum=0.1, maximum=1.0, value=0.95, step=0.05, label='Top-p'),
        gr.Checkbox(label='Use Cohere model instead.'),
    ],
)

if __name__ == '__main__':
    demo.launch()  # starts the local server and serves the UI
```
## License

This project is licensed under the MIT License.

## Author

👤 Your Name

📧 Contact: gabor.toth.103@gmail.com