MaziyarPanahi committed on
Commit 1566f67
1 Parent(s): c21d268

Correct number of parameters (#26)


- Correct number of parameters (e57b55a07e49b5ea5e7d0135560eaa993b0e868f)

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -32,7 +32,7 @@ language:
 
 On April 10th, [@MistralAI](https://huggingface.co/mistralai) released a model named "Mixtral 8x22B," an 176B MoE via magnet link (torrent):
 
-- 176B MoE with ~40B active
+- 141B MoE with ~35B active
 - Context length of 65k tokens
 - The base model can be fine-tuned
 - Requires ~260GB VRAM in fp16, 73GB in int4
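The VRAM figures in the changed bullet list can be sanity-checked with back-of-the-envelope arithmetic: weight memory is roughly parameter count times bytes per parameter. A minimal sketch (the helper `weight_vram_gb` is hypothetical; it counts weights only, ignoring KV cache, activations, and framework overhead, which is why the README's figures differ somewhat):

```python
def weight_vram_gb(n_params_billion: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GB (1 GB taken as 1e9 bytes)."""
    return n_params_billion * bytes_per_param

# fp16 stores 2 bytes per parameter; int4 stores 0.5 bytes per parameter.
fp16_gb = weight_vram_gb(141, 2.0)   # 141B params in fp16
int4_gb = weight_vram_gb(141, 0.5)   # 141B params in int4
print(f"fp16: ~{fp16_gb:.0f} GB, int4: ~{int4_gb:.0f} GB")
```

The int4 estimate (~70 GB) lines up closely with the quoted 73 GB; the fp16 estimate lands near the quoted ~260 GB once the comparison accounts for which total parameter count is assumed.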