mradermacher committed
Commit 7d471cb
1 Parent(s): 2b7cd97

auto-patch README.md

Files changed (1): README.md (+3 −2)
README.md CHANGED
@@ -13,8 +13,9 @@ tags:
 ## About
 
 weighted/imatrix quants of https://huggingface.co/wolfram/miquliz-120b-v2.0
-<!-- provided-files -->
 
+<!-- provided-files -->
+static quants are available at https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF
 ## Usage
 
 If you are unsure how to use GGUF files, refer to one of [TheBloke's
@@ -27,7 +28,7 @@ more details, including on how to concatenate multi-part files.
 
 | Link | Type | Size/GB | Notes |
 |:-----|:-----|--------:|:------|
-| [GGUF](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ1_S.gguf) | i1-IQ1_S | 25.7 | |
+| [GGUF](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ1_S.gguf) | i1-IQ1_S | 25.7 | for the desperate |
 | [GGUF](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 32.2 | |
 | [GGUF](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ2_XS.gguf) | i1-IQ2_XS | 35.8 | |
 | [GGUF](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q2_K.gguf) | i1-Q2_K | 44.6 | IQ3_XXS probably better |
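The README being patched mentions concatenating multi-part files: very large quants are split into several pieces on upload, and rejoining them is a plain byte-level concatenation in part order. A minimal sketch, with hypothetical file names (real split quants from repos like this typically use a `*.gguf.part1of2`-style suffix):

```shell
# Simulate a two-part download with dummy data (file names are illustrative):
printf 'first-half'  > model.gguf.part1of2
printf 'second-half' > model.gguf.part2of2

# Join the parts in order into a single usable GGUF file:
cat model.gguf.part1of2 model.gguf.part2of2 > model.gguf

# The parts can be deleted afterwards to reclaim disk space.
```

The same approach works for any number of parts, as long as they are listed in ascending order on the `cat` command line.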