GPTQ quantization of https://huggingface.co/TehVenom/Dolly_Malion-6b

Using this repository: https://github.com/mayaeary/GPTQ-for-LLaMa/tree/gptj-v2

Command:

```
python3 gptj.py models/Dolly_Malion-6b c4 --wbits 4 --groupsize 128 --save_safetensors models/Dolly_Malion-6b-4bit-128g.safetensors
```
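The `--wbits 4 --groupsize 128` flags mean each weight is stored in 4 bits, with a separate scale and zero-point per group of 128 consecutive weights in a row. The sketch below illustrates only that group-wise quantization scheme with plain round-to-nearest, not GPTQ's error-compensating algorithm; the function names and shapes are illustrative, not part of the repository above.

```python
import numpy as np

def quantize_group(w, wbits=4):
    # Asymmetric round-to-nearest quantization of one group of weights.
    qmax = 2 ** wbits - 1
    scale = (w.max() - w.min()) / qmax
    zero = np.round(-w.min() / scale)
    q = np.clip(np.round(w / scale + zero), 0, qmax)
    return q, scale, zero

def fake_quantize(W, wbits=4, groupsize=128):
    # Each row is split into groups of `groupsize` columns; every group
    # gets its own scale/zero-point, mirroring --wbits 4 --groupsize 128.
    # Returns the dequantized matrix so the error is easy to inspect.
    Wq = np.empty_like(W)
    for r in range(W.shape[0]):
        for c in range(0, W.shape[1], groupsize):
            q, scale, zero = quantize_group(W[r, c:c + groupsize], wbits)
            Wq[r, c:c + groupsize] = (q - zero) * scale
    return Wq

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 256)).astype(np.float32)
Wq = fake_quantize(W)
err = np.abs(W - Wq).max()  # worst-case error is about half a quantization step
```

Smaller group sizes lower the quantization error (each scale fits its group more tightly) at the cost of storing more scales and zero-points.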