---
title: Chatbot example
emoji: 💬
colorFrom: yellow
colorTo: purple
sdk: gradio
sdk_version: 5.0.1
app_file: app.py
pinned: false
short_description: Chatbot with Hugging Face Spaces & Gradio 
---
This is an interactive chatbot deployed on **Hugging Face Spaces** using **Gradio**. It supports both **Cohere** and **Hugging Face** models, allowing users to select between them for generating responses. It is built with [Gradio](https://gradio.app), [`huggingface_hub`](https://huggingface.co/docs/huggingface_hub/v0.22.2/en/index), and the [Hugging Face Inference API](https://huggingface.co/docs/api-inference/index), with optional use of the Cohere API.

## Features
✅ **Supports two AI models:**
   - Cohere API (`command-r-plus`)
   - Hugging Face API (`mistralai/Mistral-7B-Instruct-v0.3`)
✅ **Customizable Settings:**
   - System prompt
   - Max tokens
   - Temperature
   - Top-p value
✅ **Streaming responses** (for Hugging Face models)
✅ **Gradio-powered UI** for easy interaction

## Installation & Setup

### 1️⃣ Clone the Repository
```bash
git clone https://huggingface.co/spaces/your-space-name
cd your-space-name
```

### 2️⃣ Install Dependencies
Make sure you have Python installed, then run:
```bash
pip install -r requirements.txt
```
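The contents of `requirements.txt` are not reproduced in this README; given the libraries the app imports, it presumably lists at least the following (check the actual file in the Space):
```text
gradio
huggingface_hub
cohere
```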

### 3️⃣ Set API Keys
You need to set up API keys for **Hugging Face** and **Cohere**. You can do this via environment variables:
```bash
export HF_API_KEY='your-huggingface-api-key'
export COHERE_API_KEY='your-cohere-api-key'
```

### 4️⃣ Run the App Locally
```bash
python app.py
```
This will launch the Gradio interface in your browser.

## Deployment on Hugging Face Spaces
1. Create a new **Space** on Hugging Face.
2. Choose **Gradio** as the framework.
3. Upload `app.py` and `requirements.txt` (or push them with git, as shown below).
4. Deploy and test the chatbot.
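
Since a Space is itself a git repository (the same one cloned in step 1 of the setup), you can also deploy by committing and pushing the files rather than uploading them through the web UI:
```bash
# Deploy by pushing to the Space repository cloned earlier
git add app.py requirements.txt
git commit -m "Add chatbot app"
git push
```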

## Usage
1. Enter your message in the chatbox.
2. Choose the AI model (Hugging Face or Cohere).
3. Adjust chatbot parameters as needed.
4. Receive and interact with AI-generated responses.

## Code Overview

### API Clients
```python
import os
import cohere
from huggingface_hub import InferenceClient

# API keys are read from the environment variables set during setup.
HF_API_KEY = os.environ.get('HF_API_KEY')
COHERE_API_KEY = os.environ.get('COHERE_API_KEY')

client_hf = InferenceClient(model='mistralai/Mistral-7B-Instruct-v0.3', token=HF_API_KEY)
client_cohere = cohere.Client(COHERE_API_KEY)
```

### Chatbot Function
```python
def respond(message: str, history: list, system_message: str, max_tokens: int,
            temperature: float, top_p: float, use_cohere: bool):
    # Start from the system prompt, replay the (user, assistant) pairs that
    # Gradio passes in as `history`, then append the new user message.
    messages = [{"role": "system", "content": system_message}]
    for val in history:
        messages.append({"role": "user", "content": val[0]})
        messages.append({"role": "assistant", "content": val[1]})
    messages.append({"role": "user", "content": message})
```
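
The snippet above only assembles the message list. A minimal sketch of how the rest of `respond` might produce a reply, assuming the `client_hf` and `client_cohere` objects defined above and the current `huggingface_hub`/`cohere` chat APIs (the exact parameters in the real `app.py` may differ):
```python
    # Continuation of respond() -- illustrative only.
    if use_cohere:
        # Cohere path: one-shot reply, with the system prompt passed as the
        # preamble (chat history omitted here for brevity).
        response = client_cohere.chat(
            model='command-r-plus',
            message=message,
            preamble=system_message,
            max_tokens=max_tokens,
            temperature=temperature,
            p=top_p,
        )
        yield response.text
    else:
        # Hugging Face path: stream tokens so the Gradio UI updates incrementally.
        partial = ''
        for chunk in client_hf.chat_completion(
            messages,
            max_tokens=max_tokens,
            temperature=temperature,
            top_p=top_p,
            stream=True,
        ):
            partial += chunk.choices[0].delta.content or ''
            yield partial
```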

### Gradio UI Setup
```python
import gradio as gr

# The additional inputs map, in order, to respond()'s extra parameters.
demo = gr.ChatInterface(
    respond,
    additional_inputs=[
        gr.Textbox(value='You are a friendly Chatbot.', label='System prompt'),
        gr.Slider(minimum=1, maximum=2048, value=512, step=1, label='Max new tokens'),
        gr.Slider(minimum=0.1, maximum=4.0, value=0.7, step=0.1, label='Temperature'),
        gr.Slider(minimum=0.1, maximum=1.0, value=0.95, step=0.05, label='Top-p'),
        gr.Checkbox(label='Use Cohere model instead.'),
    ],
)

if __name__ == '__main__':
    demo.launch()
```

## License
This project is licensed under the MIT License.

## Author
👤 **Your Name**  
📧 Contact: gabor.toth.103@gmail.com