Transformers
GGUF
English
Inference Endpoints
mradermacher committed on
Commit
44ecf3e
1 Parent(s): 2ee63ad

auto-patch README.md

Files changed (1)
README.md +1 -2
README.md CHANGED
@@ -73,7 +73,7 @@ more details, including on how to concatenate multi-part files.
 | Link | Type | Size/GB | Notes |
 |:-----|:-----|--------:|:------|
 | [GGUF](https://huggingface.co/mradermacher/bagel-dpo-20b-v04-i1-GGUF/resolve/main/bagel-dpo-20b-v04.i1-IQ1_S.gguf) | i1-IQ1_S | 5.4 | for the desperate |
-| [GGUF](https://huggingface.co/mradermacher/bagel-dpo-20b-v04-i1-GGUF/resolve/main/bagel-dpo-20b-v04.i1-IQ1_M.gguf) | i1-IQ1_M | 5.8 | for the desperate |
+| [GGUF](https://huggingface.co/mradermacher/bagel-dpo-20b-v04-i1-GGUF/resolve/main/bagel-dpo-20b-v04.i1-IQ1_M.gguf) | i1-IQ1_M | 5.8 | mostly desperate |
 | [GGUF](https://huggingface.co/mradermacher/bagel-dpo-20b-v04-i1-GGUF/resolve/main/bagel-dpo-20b-v04.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 6.4 | |
 | [GGUF](https://huggingface.co/mradermacher/bagel-dpo-20b-v04-i1-GGUF/resolve/main/bagel-dpo-20b-v04.i1-IQ2_XS.gguf) | i1-IQ2_XS | 6.9 | |
 | [GGUF](https://huggingface.co/mradermacher/bagel-dpo-20b-v04-i1-GGUF/resolve/main/bagel-dpo-20b-v04.i1-IQ2_S.gguf) | i1-IQ2_S | 7.3 | |
@@ -94,7 +94,6 @@ more details, including on how to concatenate multi-part files.
 | [GGUF](https://huggingface.co/mradermacher/bagel-dpo-20b-v04-i1-GGUF/resolve/main/bagel-dpo-20b-v04.i1-Q5_K_M.gguf) | i1-Q5_K_M | 14.8 | |
 | [GGUF](https://huggingface.co/mradermacher/bagel-dpo-20b-v04-i1-GGUF/resolve/main/bagel-dpo-20b-v04.i1-Q6_K.gguf) | i1-Q6_K | 17.1 | practically like static Q6_K |
 
-
 Here is a handy graph by ikawrakow comparing some lower-quality quant
 types (lower is better):
 
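For reference, here is a minimal sketch (not part of the original README or this commit) of fetching one of the quant files listed in the table above, assuming the `huggingface_hub` Python package is installed. The repo id and filename are taken from the i1-Q6_K row; any other filename from the table works the same way.

```python
# Sketch: download a single GGUF quant from the repo shown in the table above.
from huggingface_hub import hf_hub_download

# Repo and filename copied from the i1-Q6_K row of the README table.
path = hf_hub_download(
    repo_id="mradermacher/bagel-dpo-20b-v04-i1-GGUF",
    filename="bagel-dpo-20b-v04.i1-Q6_K.gguf",
)
print(path)  # local cache path of the downloaded GGUF file
```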