Commit 78814a9
Parent(s): dbea3ef
Update README.md (#15)
- Update README.md (8959b31e520b808acd9ad08db60880e5b8677b85)
Co-authored-by: Devendra Singh Chaplot <devendrachaplot@users.noreply.huggingface.co>
README.md CHANGED
@@ -18,6 +18,27 @@ Mistral-7B-v0.1 is a transformer model, with the following architecture choices:
 - Sliding-Window Attention
 - Byte-fallback BPE tokenizer
 
+## Troubleshooting
+- If you see the following error:
+```
+Traceback (most recent call last):
+  File "<stdin>", line 1, in <module>
+  File "/transformers/models/auto/auto_factory.py", line 482, in from_pretrained
+    config, kwargs = AutoConfig.from_pretrained(
+  File "/transformers/models/auto/configuration_auto.py", line 1022, in from_pretrained
+    config_class = CONFIG_MAPPING[config_dict["model_type"]]
+  File "/transformers/models/auto/configuration_auto.py", line 723, in __getitem__
+    raise KeyError(key)
+KeyError: 'mistral'
+```
+
+Installing transformers from source should solve the issue:
+```
+pip install git+https://github.com/huggingface/transformers
+```
+This should not be required after transformers-v4.33.4.
+
+
 ## Notice
 
 Mistral 7B is a pretrained base model and therefore does not have any moderation mechanisms.
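For readers arriving here from the traceback above, the following is a minimal sketch, not part of the commit, of the load path that used to fail. It assumes a transformers release that already includes the Mistral architecture (per the note above, anything newer than v4.33.4), the repository id `mistralai/Mistral-7B-v0.1`, and enough memory for the 7B weights.

```python
# Sanity-check sketch (not part of the commit): confirm that the installed
# transformers release can resolve the "mistral" model type before loading.
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"  # assumed repository id

# This is the lookup that raises KeyError: 'mistral' on releases that
# predate the Mistral architecture; it succeeds once "mistral" is
# registered in CONFIG_MAPPING.
config = AutoConfig.from_pretrained(model_id)
print(config.model_type)  # expected: "mistral"

# Full load; needs enough CPU/GPU memory for the 7B weights.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("My favourite condiment is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the `AutoConfig.from_pretrained` call still raises `KeyError: 'mistral'`, the environment is on a pre-Mistral transformers build and the source install shown in the diff (or an upgrade to a newer release) is still needed.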