Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -24,7 +24,7 @@ tags:
 
 # Mixtral-8x22B-v0.1-GGUF
 
-On April 10th, @MistralAI released a model named "Mixtral 8x22B," an 176B MoE via magnet link (torrent):
+On April 10th, [@MistralAI](https://huggingface.co/mistralai) released a model named "Mixtral 8x22B," a 176B MoE via magnet link (torrent):
 
 - 176B MoE with ~40B active
 - Context length of 65k tokens
@@ -95,7 +95,7 @@ Since this appears to be a base model, it will keep on generating.
 
 - [MistralAI](https://huggingface.co/mistralai) for opening the weights
 - [v2ray](https://huggingface.co/v2ray/) for downloading, converting, and sharing it with the community [Mixtral-8x22B-v0.1](https://huggingface.co/v2ray/Mixtral-8x22B-v0.1)
-- [philschmid]([https://huggingface.co/philschmid) for the photo he shared on his Twitter
+- [philschmid](https://huggingface.co/philschmid) for the photo he shared on his Twitter
 
 β–„β–„β–„β–‘β–‘
 β–„β–„β–„β–„β–„β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–‘β–‘β–‘β–‘
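
The "176B MoE with ~40B active" figures in the hunk above can be sanity-checked with quick arithmetic. A minimal sketch, assuming top-2 expert routing over 8 experts (as in the earlier Mixtral 8x7B); the shared-weight adjustment is an assumption, not something stated in the diff:

```python
# Back-of-envelope check of the README's "176B MoE with ~40B active" figures.
# Assumptions (hypothetical, not from the diff): 8 experts, top-2 routing,
# and some attention/embedding weights shared across all experts.
EXPERTS = 8
ACTIVE_EXPERTS = 2          # top-2 routing, as in Mixtral 8x7B
PARAMS_PER_EXPERT_B = 22.0  # the "22B" in the model name

total_b = EXPERTS * PARAMS_PER_EXPERT_B                # 8 * 22 = 176B total
naive_active_b = ACTIVE_EXPERTS * PARAMS_PER_EXPERT_B  # 2 * 22 = 44B

print(f"total: ~{total_b:.0f}B, naive active: ~{naive_active_b:.0f}B")
# Weights shared by all experts are counted once per token, which is why the
# README's ~40B active figure sits a little below the naive 44B estimate.
```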
 