Tags: Transformers · GGUF · English · Mixture of Experts · mixtral · openchat/openchat-3.5-0106 · giux78/zefiro-7b-beta-ITA-v0.1 · azale-ai/Starstreak-7b-beta · gagan3012/Mistral_arabic_dpo · davidkim205/komt-mistral-7b-v1 · OpenBuddy/openbuddy-zephyr-7b-v14.1 · manishiitg/open-aditi-hi-v1 · VAGOsolutions/SauerkrautLM-7b-v1-mistral · Inference Endpoints · conversational
auto-patch README.md
README.md CHANGED
@@ -27,7 +27,7 @@ tags:
 static quants of https://huggingface.co/gagan3012/Multilingual-mistral
 
 <!-- provided-files -->
-weighted/imatrix quants
+weighted/imatrix quants are available at https://huggingface.co/mradermacher/Multilingual-mistral-i1-GGUF
 ## Usage
 
 If you are unsure how to use GGUF files, refer to one of [TheBloke's
@@ -44,9 +44,11 @@ more details, including on how to concatenate multi-part files.
 | [GGUF](https://huggingface.co/mradermacher/Multilingual-mistral-GGUF/resolve/main/Multilingual-mistral.Q3_K_S.gguf) | Q3_K_S | 20.5 | |
 | [GGUF](https://huggingface.co/mradermacher/Multilingual-mistral-GGUF/resolve/main/Multilingual-mistral.Q3_K_M.gguf) | Q3_K_M | 22.6 | lower quality |
 | [GGUF](https://huggingface.co/mradermacher/Multilingual-mistral-GGUF/resolve/main/Multilingual-mistral.Q3_K_L.gguf) | Q3_K_L | 24.3 | |
+| [GGUF](https://huggingface.co/mradermacher/Multilingual-mistral-GGUF/resolve/main/Multilingual-mistral.IQ4_XS.gguf) | IQ4_XS | 25.5 | |
 | [GGUF](https://huggingface.co/mradermacher/Multilingual-mistral-GGUF/resolve/main/Multilingual-mistral.Q4_K_S.gguf) | Q4_K_S | 26.8 | fast, recommended |
 | [GGUF](https://huggingface.co/mradermacher/Multilingual-mistral-GGUF/resolve/main/Multilingual-mistral.Q4_K_M.gguf) | Q4_K_M | 28.5 | fast, recommended |
 | [GGUF](https://huggingface.co/mradermacher/Multilingual-mistral-GGUF/resolve/main/Multilingual-mistral.Q5_K_S.gguf) | Q5_K_S | 32.3 | |
+| [GGUF](https://huggingface.co/mradermacher/Multilingual-mistral-GGUF/resolve/main/Multilingual-mistral.Q5_K_M.gguf) | Q5_K_M | 33.3 | |
 | [GGUF](https://huggingface.co/mradermacher/Multilingual-mistral-GGUF/resolve/main/Multilingual-mistral.Q6_K.gguf) | Q6_K | 38.5 | very good quality |
 | [GGUF](https://huggingface.co/mradermacher/Multilingual-mistral-GGUF/resolve/main/Multilingual-mistral.Q8_0.gguf) | Q8_0 | 49.7 | fast, best quality |
 
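
The README's Usage section defers to TheBloke's READMEs for how to work with GGUF files. As a minimal sketch, the snippet below downloads one of the quants listed in this diff and runs it with llama-cpp-python; the choice of quant, the context size, and the prompt are illustrative assumptions, not part of this repo.

```python
# Minimal sketch: fetch one GGUF quant from this repo and run a prompt.
# Assumes `huggingface_hub` and `llama-cpp-python` are installed
# (pip install huggingface_hub llama-cpp-python); parameter values are illustrative.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download the Q4_K_S quant (~26.8 GB) listed in the table above.
model_path = hf_hub_download(
    repo_id="mradermacher/Multilingual-mistral-GGUF",
    filename="Multilingual-mistral.Q4_K_S.gguf",
)

# Load the model; n_ctx is a tuning knob, not a requirement.
llm = Llama(model_path=model_path, n_ctx=4096)

out = llm("Translate to Italian: The weather is nice today.", max_tokens=64)
print(out["choices"][0]["text"])
```

Any GGUF-capable runtime (llama.cpp, ollama, and so on) reads the same files; llama-cpp-python is used here only for brevity.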
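
The second hunk's context line also mentions concatenating multi-part files. None of the quants in this table are actually split, so the part names below are hypothetical; the sketch only illustrates the idea of streaming the parts together in order.

```python
# Hypothetical sketch: merge a multi-part GGUF download into a single file.
# Part names like <name>.gguf.part1of2 are placeholders; adjust the glob to
# match the actual filenames if a quant you download is split.
import shutil
from pathlib import Path

parts = sorted(Path(".").glob("Multilingual-mistral.Q8_0.gguf.part*"))
with open("Multilingual-mistral.Q8_0.gguf", "wb") as merged:
    for part in parts:
        with open(part, "rb") as f:
            shutil.copyfileobj(f, merged)  # stream-append each part in order
```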