---
base_model: xai-org/grok-1
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- grok-1
---

## About

<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->

Static quants of https://huggingface.co/xai-org/grok-1.

<!-- provided-files -->
Weighted/imatrix quants are available at https://huggingface.co/mradermacher/grok-1-i1-GGUF.

## Usage

If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files.
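
The split parts are plain byte slices of a single `.gguf` file, so reassembly is simple concatenation in part order. Here is a minimal Python sketch, assuming the Q2_K parts from the table below have already been downloaded into the working directory (the filenames are illustrative; substitute the quant you chose):

```python
import glob

# Reassemble grok-1.Q2_K.gguf from its parts (part1of3 .. part3of3).
# sorted() yields the correct order here because only the part index
# differs and no quant in this repo has more than nine parts.
parts = sorted(glob.glob("grok-1.Q2_K.gguf.part*"))
with open("grok-1.Q2_K.gguf", "wb") as out:
    for part in parts:
        with open(part, "rb") as src:
            # Stream in chunks; these files are far too large to read whole.
            while chunk := src.read(64 * 1024 * 1024):
                out.write(chunk)
```

On Linux, `cat grok-1.Q2_K.gguf.part* > grok-1.Q2_K.gguf` produces the same result.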

## Provided Quants

(sorted by size, not necessarily quality; IQ-quants are often preferable over similarly sized non-IQ quants)

| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [PART 1](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q2_K.gguf.part1of3) [PART 2](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q2_K.gguf.part2of3) [PART 3](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q2_K.gguf.part3of3) | Q2_K | 116.3 | |
| [PART 1](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q3_K_S.gguf.part1of3) [PART 2](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q3_K_S.gguf.part2of3) [PART 3](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q3_K_S.gguf.part3of3) | Q3_K_S | 137.6 | |
| [PART 1](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q3_K_M.gguf.part1of4) [PART 2](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q3_K_M.gguf.part2of4) [PART 3](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q3_K_M.gguf.part3of4) [PART 4](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q3_K_M.gguf.part4of4) | Q3_K_M | 152.1 | lower quality |
| [PART 1](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q3_K_L.gguf.part1of4) [PART 2](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q3_K_L.gguf.part2of4) [PART 3](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q3_K_L.gguf.part3of4) [PART 4](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q3_K_L.gguf.part4of4) | Q3_K_L | 163.5 | |
| [PART 1](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q4_K_S.gguf.part1of4) [PART 2](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q4_K_S.gguf.part2of4) [PART 3](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q4_K_S.gguf.part3of4) [PART 4](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q4_K_S.gguf.part4of4) | Q4_K_S | 180.7 | fast, recommended |
| [PART 1](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q4_K_M.gguf.part1of4) [PART 2](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q4_K_M.gguf.part2of4) [PART 3](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q4_K_M.gguf.part3of4) [PART 4](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q4_K_M.gguf.part4of4) | Q4_K_M | 192.3 | fast, recommended |
| [PART 1](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q5_K_S.gguf.part1of5) [PART 2](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q5_K_S.gguf.part2of5) [PART 3](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q5_K_S.gguf.part3of5) [PART 4](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q5_K_S.gguf.part4of5) [PART 5](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q5_K_S.gguf.part5of5) | Q5_K_S | 218.1 | |
| [PART 1](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q5_K_M.gguf.part1of5) [PART 2](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q5_K_M.gguf.part2of5) [PART 3](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q5_K_M.gguf.part3of5) [PART 4](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q5_K_M.gguf.part4of5) [PART 5](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q5_K_M.gguf.part5of5) | Q5_K_M | 225.0 | |
| [PART 1](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q6_K.gguf.part1of6) [PART 2](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q6_K.gguf.part2of6) [PART 3](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q6_K.gguf.part3of6) [PART 4](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q6_K.gguf.part4of6) [PART 5](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q6_K.gguf.part5of6) [PART 6](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q6_K.gguf.part6of6) | Q6_K | 259.9 | very good quality |
| [PART 1](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q8_0.gguf.part1of7) [PART 2](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q8_0.gguf.part2of7) [PART 3](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q8_0.gguf.part3of7) [PART 4](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q8_0.gguf.part4of7) [PART 5](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q8_0.gguf.part5of7) [PART 6](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q8_0.gguf.part6of7) [PART 7](https://huggingface.co/mradermacher/grok-1-GGUF/resolve/main/grok-1.Q8_0.gguf.part7of7) | Q8_0 | 336.4 | fast, best quality |
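
If you would rather script the download than fetch each part by hand, here is a small illustrative sketch using the `huggingface_hub` library (not part of this repo's tooling); the repo id and filenames are taken from the Q2_K row above, so adjust the quant name and part count for other rows:

```python
from huggingface_hub import hf_hub_download

# Fetch all three parts of the Q2_K quant into the local HF cache.
for i in range(1, 4):
    path = hf_hub_download(
        repo_id="mradermacher/grok-1-GGUF",
        filename=f"grok-1.Q2_K.gguf.part{i}of3",
    )
    print("downloaded:", path)
```

Concatenate the downloaded parts as shown in the Usage section above; the reassembled `.gguf` can then be loaded by any GGUF-capable runtime such as llama.cpp.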

Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better):

![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9

## FAQ / Model Request

See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other model quantized.

## Thanks

I thank my company, [nethype GmbH](https://www.nethype.de/), for letting me use its servers and providing upgrades to my workstation to enable this work in my free time.

<!-- end -->