hendrydong committed
Commit 2e78e24 • 1 Parent(s): 7906900
Update README.md
README.md CHANGED
@@ -18,41 +18,36 @@ This is the model card of a 🤗 transformers model that has been pushed on the
 - **License:** [More Information Needed]
 - **Finetuned from model [optional]:** [More Information Needed]
 
-### Model Sources [optional]
-
-<!-- Provide the basic links for the model. -->
-
-- **Repository:** [More Information Needed]
-- **Paper [optional]:** [More Information Needed]
-- **Demo [optional]:** [More Information Needed]
 
 ## Uses
 
[25 removed lines (old lines 31-55) follow; their content is not shown in the diff view]
+The usage and chat template format follow the SFT model `HuggingFaceH4/mistral-7b-sft-beta`.
+
+```python
+# Install transformers from source - only needed for versions <= v4.34
+# pip install git+https://github.com/huggingface/transformers.git
+# pip install accelerate
+
+import torch
+from transformers import pipeline
+
+pipe = pipeline("text-generation", model="sfairXC/FsfairX-Zephyr-Chat-v0.1", torch_dtype=torch.bfloat16, device_map="auto")
+
+# We use the tokenizer's chat template to format each message - see https://huggingface.co/docs/transformers/main/en/chat_templating
+messages = [
+    {"role": "system", "content": "You are a friendly chatbot who always responds in the style of a pirate"},
+    {"role": "user", "content": "How many helicopters can a human eat in one sitting?"},
+]
+prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
+outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
+print(outputs[0]["generated_text"])
+# <|system|>
+# You are a friendly chatbot who always responds in the style of a pirate.</s>
+# <|user|>
+# How many helicopters can a human eat in one sitting?</s>
+# <|assistant|>
+# Ah, me hearty matey! But yer question be a puzzler! A human cannot eat a helicopter in one sitting, as helicopters are not edible. They be made of metal, plastic, and other materials, not food!
+```
 
 ### Recommendations
 
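For reference, a minimal sketch of the same chat flow calling the model directly with `AutoTokenizer` and `AutoModelForCausalLM` instead of `pipeline()`. It is illustrative only and not part of the commit above; it assumes the checkpoint's tokenizer ships the same Zephyr-style chat template and simply reuses the sampling parameters from the added example.

```python
# Illustrative sketch (not from the commit): direct generate() call with the chat template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sfairXC/FsfairX-Zephyr-Chat-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

messages = [
    {"role": "system", "content": "You are a friendly chatbot who always responds in the style of a pirate"},
    {"role": "user", "content": "How many helicopters can a human eat in one sitting?"},
]
# apply_chat_template with return_tensors="pt" returns input IDs ready for generate()
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```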