| [main](https://huggingface.co/TheBloke/Spicyboros-c34b-2.2-GPTQ/tree/main) | 4 | None | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 17.69 GB | Yes | 4-bit, with Act Order. No group size, to lower VRAM requirements. |
| [gptq-4bit-128g-actorder_True](https://huggingface.co/TheBloke/Spicyboros-c34b-2.2-GPTQ/tree/gptq-4bit-128g-actorder_True) | 4 | 128 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 18.33 GB | Yes | 4-bit, with Act Order and group size 128g. Uses less VRAM than 32g, but with slightly lower accuracy. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/Spicyboros-c34b-2.2-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 20.28 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. |
| [gptq-3bit-128g-actorder_True](https://huggingface.co/TheBloke/Spicyboros-c34b-2.2-GPTQ/tree/gptq-3bit-128g-actorder_True) | 3 | 128 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 14.14 GB | No | 3-bit, with group size 128g and act-order. Higher quality than 128g-False. |
| [gptq-3bit-32g-actorder_True](https://huggingface.co/TheBloke/Spicyboros-c34b-2.2-GPTQ/tree/gptq-3bit-32g-actorder_True) | 3 | 32 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 15.99 GB | No | 3-bit, with group size 32g and act-order. Highest quality 3-bit option. |
| [gptq-8bit--1g-actorder_True](https://huggingface.co/TheBloke/Spicyboros-c34b-2.2-GPTQ/tree/gptq-8bit--1g-actorder_True) | 8 | None | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 34.30 GB | No | 8-bit, with Act Order. No group size, to lower VRAM requirements. |
<!-- README_GPTQ.md-provided-files end -->
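As a rough rule of thumb, you can use the file sizes in the table above to pick a branch for your hardware. The helper below is a hypothetical sketch (not part of this repo or any library): it hard-codes the sizes from the table and returns the largest quant that fits a given VRAM budget. Note these are on-disk sizes; actual VRAM use is higher once context and activations are loaded, so leave headroom.

```python
# Hypothetical helper: choose a quant branch by VRAM budget.
# Sizes (GB) are the on-disk file sizes from the table above, not
# peak VRAM usage, so treat the result as a starting point only.
BRANCH_SIZES_GB = {
    "main": 17.69,                          # 4-bit, no group size
    "gptq-4bit-128g-actorder_True": 18.33,  # 4-bit, 128g
    "gptq-4bit-32g-actorder_True": 20.28,   # 4-bit, 32g
    "gptq-3bit-128g-actorder_True": 14.14,  # 3-bit, 128g
    "gptq-3bit-32g-actorder_True": 15.99,   # 3-bit, 32g
    "gptq-8bit--1g-actorder_True": 34.30,   # 8-bit, no group size
}

def pick_branch(vram_budget_gb: float):
    """Return the largest branch whose files fit the budget, or None."""
    fitting = {b: s for b, s in BRANCH_SIZES_GB.items() if s <= vram_budget_gb}
    if not fitting:
        return None
    return max(fitting, key=fitting.get)

print(pick_branch(24.0))  # gptq-4bit-32g-actorder_True
print(pick_branch(15.0))  # gptq-3bit-128g-actorder_True
```

Bigger files generally mean higher quality within the same bit width (smaller group size keeps more quantization scales), which is why "largest that fits" is a reasonable default.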
### Fine-tuning details
https://gist.github.com/jondurbin/51a336c582a224de197ba1d2c6b1da97
*Note: I used checkpoint 750 for final model!*
### Helpful usage tips