## About

weighted/imatrix quants of https://huggingface.co/quantumaikr/falcon-180B-WizardLM_Orca

<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/falcon-180B-WizardLM_Orca-GGUF

## Usage

If you are unsure how to use GGUF files, refer to one of TheBloke's READMEs for more details, including on how to concatenate multi-part files.

| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/falcon-180B-WizardLM_Orca-i1-GGUF/resolve/main/falcon-180B-WizardLM_Orca.i1-IQ1_S.gguf) | i1-IQ1_S | 37.4 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/falcon-180B-WizardLM_Orca-i1-GGUF/resolve/main/falcon-180B-WizardLM_Orca.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 46.8 | |
| [PART 1](https://huggingface.co/mradermacher/falcon-180B-WizardLM_Orca-i1-GGUF/resolve/main/falcon-180B-WizardLM_Orca.i1-IQ2_M.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/falcon-180B-WizardLM_Orca-i1-GGUF/resolve/main/falcon-180B-WizardLM_Orca.i1-IQ2_M.gguf.part2of2) | i1-IQ2_M | 60.3 | |
| [PART 1](https://huggingface.co/mradermacher/falcon-180B-WizardLM_Orca-i1-GGUF/resolve/main/falcon-180B-WizardLM_Orca.i1-IQ3_XXS.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/falcon-180B-WizardLM_Orca-i1-GGUF/resolve/main/falcon-180B-WizardLM_Orca.i1-IQ3_XXS.gguf.part2of2) | i1-IQ3_XXS | 68.5 | fast, lower quality |
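The PART 1 / PART 2 files above are plain byte-level splits, so a simple `cat` of the parts in order restores the original single `.gguf`. A minimal sketch, demonstrated on a dummy file (the `model.gguf.*` names and the use of `split` here are illustrative, not part of this repo):

```shell
# For the real i1-IQ2_M quant, after downloading both parts, the same
# pattern would be (hypothetical local paths):
#   cat falcon-180B-WizardLM_Orca.i1-IQ2_M.gguf.part1of2 \
#       falcon-180B-WizardLM_Orca.i1-IQ2_M.gguf.part2of2 \
#       > falcon-180B-WizardLM_Orca.i1-IQ2_M.gguf

printf 'dummy-gguf-bytes' > model.gguf.orig           # stand-in for a full quant
split -n 2 model.gguf.orig model.gguf.part            # GNU split: model.gguf.partaa, model.gguf.partab
cat model.gguf.partaa model.gguf.partab > model.gguf  # recombine the pieces in order
cmp -s model.gguf model.gguf.orig && echo "parts recombine cleanly"
```

Do not load an individual `.partXofY` file directly; llama.cpp and similar tooling expect the recombined `.gguf`.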