It took ~3 hours to train 3 epochs on 1x A100 (40 GB SXM).

Prompt format:

This model uses the same prompt format as [mistralai/Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1) and does **not** expect a system prompt. This format is available as a [chat template](https://huggingface.co/docs/transformers/main/chat_templating) via the `apply_chat_template()` method. Here's an illustrative example:

```python
messages = [
    {"role": "user", "content": "What is your favourite condiment?"},
    {"role": "assistant", "content": "Well, I'm quite partial to a good squeeze of fresh lemon juice. It adds just the right amount of zesty flavour to whatever I'm cooking up in the kitchen!"},
    {"role": "user", "content": "Do you have mayonnaise recipes?"}
]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
```

<details>
<summary>Output</summary>

**Prompt**: `<s>[INST] What is your favourite condiment? [/INST]Well, I'm quite partial to a good squeeze of fresh lemon juice. It adds just the right amount of zesty flavour to whatever I'm cooking up in the kitchen!</s> [INST] Do you have mayonnaise recipes? [/INST]`

</details>
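
The canonical template lives in the tokenizer config, so `apply_chat_template()` is the method to use in practice. Purely as an illustrative sketch of the layout shown above, the hypothetical helper below (`build_prompt` is not part of `transformers`) assembles the same `<s>[INST] ... [/INST]` string by hand from a `messages` list:

```python
def build_prompt(messages):
    """Hand-rolled sketch of the Mistral-Instruct prompt layout.

    User turns are wrapped in [INST] ... [/INST]; assistant turns follow
    immediately and are closed with </s>. No system prompt is used.
    """
    prompt = "<s>"
    for msg in messages:
        if msg["role"] == "user":
            prompt += f"[INST] {msg['content']} [/INST]"
        elif msg["role"] == "assistant":
            prompt += f"{msg['content']}</s> "
    return prompt

messages = [
    {"role": "user", "content": "What is your favourite condiment?"},
    {"role": "assistant", "content": "Well, I'm quite partial to a good squeeze of fresh lemon juice. It adds just the right amount of zesty flavour to whatever I'm cooking up in the kitchen!"},
    {"role": "user", "content": "Do you have mayonnaise recipes?"}
]
print(build_prompt(messages))
```

Note there is no space between `[/INST]` and the assistant's reply, matching the output shown above; small whitespace differences like this are exactly why the tokenizer's own chat template should be preferred over manual string building.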
## Training Hyperparameters