auto-patch README.md
README.md CHANGED

@@ -18,6 +18,7 @@ tags:
 weighted/imatrix quants of https://huggingface.co/terrycraddock/Reflection-Llama-3.1-8B
 
 <!-- provided-files -->
+static quants are available at https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-GGUF
 ## Usage
 
 If you are unsure how to use GGUF files, refer to one of [TheBloke's
@@ -30,27 +31,27 @@ more details, including on how to concatenate multi-part files.
 
 | Link | Type | Size/GB | Notes |
 |:-----|:-----|--------:|:------|
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
-| [PART 1](https://huggingface.co/mradermacher/
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-IQ1_S.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-IQ1_S.gguf) | i1-IQ1_S | 4.1 | for the desperate |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-IQ1_M.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-IQ1_M.gguf) | i1-IQ1_M | 4.4 | mostly desperate |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-IQ2_XXS.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 4.9 | |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-IQ2_XS.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-IQ2_XS.gguf) | i1-IQ2_XS | 5.3 | |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-IQ2_S.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-IQ2_S.gguf) | i1-IQ2_S | 5.6 | |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-IQ2_M.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-IQ2_M.gguf) | i1-IQ2_M | 6.0 | |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-Q2_K.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-Q2_K.gguf) | i1-Q2_K | 6.5 | IQ3_XXS probably better |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-IQ3_XXS.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 6.6 | lower quality |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-IQ3_XS.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-IQ3_XS.gguf) | i1-IQ3_XS | 7.1 | |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-Q3_K_S.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-Q3_K_S.gguf) | i1-Q3_K_S | 7.4 | IQ3_XS probably better |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-IQ3_S.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-IQ3_S.gguf) | i1-IQ3_S | 7.5 | beats Q3_K* |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-IQ3_M.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-IQ3_M.gguf) | i1-IQ3_M | 7.7 | |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-Q3_K_M.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-Q3_K_M.gguf) | i1-Q3_K_M | 8.1 | IQ3_S probably better |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-Q3_K_L.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-Q3_K_L.gguf) | i1-Q3_K_L | 8.7 | IQ3_M probably better |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-IQ4_XS.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-IQ4_XS.gguf) | i1-IQ4_XS | 9.0 | |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-Q4_0.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-Q4_0.gguf) | i1-Q4_0 | 9.5 | fast, low quality |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-Q4_K_S.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-Q4_K_S.gguf) | i1-Q4_K_S | 9.5 | optimal size/speed/quality |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-Q4_K_M.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-Q4_K_M.gguf) | i1-Q4_K_M | 9.9 | fast, recommended |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-Q5_K_S.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-Q5_K_S.gguf) | i1-Q5_K_S | 11.3 | |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-Q5_K_M.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-Q5_K_M.gguf) | i1-Q5_K_M | 11.6 | |
+| [PART 1](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/Reflection-Llama-3.1-8B.i1-Q6_K.gguf) [PART 2](https://huggingface.co/mradermacher/Reflection-Llama-3.1-8B-i1-GGUF/resolve/main/reflection-llama-3.1-8B.i1-Q6_K.gguf) | i1-Q6_K | 13.3 | practically like static Q6_K |
 
 Here is a handy graph by ikawrakow comparing some lower-quality quant
 types (lower is better):
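
For anyone reading this commit who wants to fetch one of the quants listed in the new table, here is a minimal Python sketch (not part of the README itself) using the huggingface_hub client; the repo_id and filename are copied from the "i1-Q4_K_M ... fast, recommended" row above.

```python
# Minimal sketch, assuming `pip install huggingface_hub` has been run.
# repo_id and filename come straight from the i1-Q4_K_M table row.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="mradermacher/Reflection-Llama-3.1-8B-i1-GGUF",
    filename="Reflection-Llama-3.1-8B.i1-Q4_K_M.gguf",
)
print(local_path)  # cached local path of the downloaded GGUF file
```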
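The second hunk's context line mentions "how to concatenate multi-part files". The README delegates the details to the linked guides, but a hedged sketch of the general idea is below: parts are streamed in order into a single output file. The .part1of2/.part2of2 filenames are hypothetical placeholders, not names taken from this repo.

```python
# Hedged sketch of multi-part concatenation: each part's bytes are
# appended in order. Part filenames below are hypothetical examples;
# substitute the actual part names from the download page.
import shutil

parts = [
    "Reflection-Llama-3.1-8B.i1-Q6_K.gguf.part1of2",  # hypothetical
    "Reflection-Llama-3.1-8B.i1-Q6_K.gguf.part2of2",  # hypothetical
]
with open("Reflection-Llama-3.1-8B.i1-Q6_K.gguf", "wb") as out:
    for part in parts:
        with open(part, "rb") as src:
            shutil.copyfileobj(src, out)  # stream this part into the output
```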