---
license: mit
---
EXL2 quants of [GPT2](https://huggingface.co/openai-community/gpt2). Each bitrate is stored on its own branch; a download sketch follows the links below.
[3.00 bits per weight](https://huggingface.co/turboderp/gpt2-exl2/tree/3.0bpw)

[4.00 bits per weight](https://huggingface.co/turboderp/gpt2-exl2/tree/4.0bpw)

[6.00 bits per weight](https://huggingface.co/turboderp/gpt2-exl2/tree/6.0bpw)

[8.00 bits per weight](https://huggingface.co/turboderp/gpt2-exl2/tree/8.0bpw)

[measurement.json](https://huggingface.co/turboderp/gpt2-exl2/blob/main/measurement.json)
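
A minimal sketch for pulling a single branch with `huggingface_hub`; the `revision` values are the branch names linked above, and the `local_dir` name is just an example:

```python
# Download one quantization branch of turboderp/gpt2-exl2 to a local folder.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="turboderp/gpt2-exl2",
    revision="4.0bpw",              # branch: 3.0bpw / 4.0bpw / 6.0bpw / 8.0bpw
    local_dir="gpt2-exl2-4.0bpw",   # example output directory
)
```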