Update README.md
README.md CHANGED
@@ -31,6 +31,14 @@ mistral_models_path.mkdir(parents=True, exist_ok=True)
 snapshot_download(repo_id="mistralai/Mistral-7B-Instruct-v0.3", allow_patterns=["params.json", "consolidated.safetensors", "tokenizer.model.v3"], cache_dir=mistral_models_path)
 ```
 
+### Chat
+
+After installing `mistral_inference`, a `mistral-chat` command should be available in your CLI. You can chat with the model using
+
+```
+mistral-chat $HOME/mistral_models/7B-Instruct-v0.3
+```
+
 ### Instruct following
 
 ```py
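For context when reading this hunk: the `@@` header shows the download snippet runs right after the README creates the cache directory (`mistral_models_path.mkdir(parents=True, exist_ok=True)`). A minimal sketch of that setup, assuming the path layout implied by the `mistral-chat $HOME/mistral_models/7B-Instruct-v0.3` command in the added section:

```python
from pathlib import Path

# Cache directory for the model files; this matches the path the
# `mistral-chat` command in the diff expects
# ($HOME/mistral_models/7B-Instruct-v0.3).
mistral_models_path = Path.home().joinpath("mistral_models", "7B-Instruct-v0.3")
mistral_models_path.mkdir(parents=True, exist_ok=True)

print(mistral_models_path)
```

`snapshot_download` from `huggingface_hub` then fetches `params.json`, `consolidated.safetensors`, and `tokenizer.model.v3` into this directory, as shown in the hunk's first context line (network access and an accepted model license are required for the actual download).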