---
license: apache-2.0
tags:
- text generation
- conversational
- gptq
- 4bit
inference: false
language:
- en
pipeline_tag: text-generation
---
GPTQ 4-bit quantization of https://huggingface.co/KoboldAI/PPO_Pygway-6b-Mix, produced with this repository: https://github.com/mayaeary/GPTQ-for-LLaMa/tree/gptj-v2

Quantization command:
```
python3 gptj.py models/PPO_Pygway-6b-Mix c4 --wbits 4 --groupsize 128 --save_safetensors models/PPO_Pygway-6b-Mix-4bit-128g.safetensors
```
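
For reference, the resulting 4-bit checkpoint can be downloaded and inspected with `huggingface_hub` and `safetensors`. This is a minimal sketch only; the `repo_id` and `filename` values below are assumptions about this repository's layout, not confirmed identifiers. Note that the file contains packed GPTQ weights, so actual inference requires GPTQ-aware loading code (for example the GPTQ-for-LLaMa fork linked above), not a plain `from_pretrained` call.

```
# Minimal sketch: fetch the quantized safetensors file and list its tensors.
# repo_id and filename are assumptions; adjust to the actual repository.
from huggingface_hub import hf_hub_download
from safetensors.torch import load_file

checkpoint_path = hf_hub_download(
    repo_id="mayaeary/PPO_Pygway-6b-Mix-4bit-128g",       # assumed repo id
    filename="PPO_Pygway-6b-Mix-4bit-128g.safetensors",   # assumed filename
)

# Load the tensors and print a few quantized weight names, shapes and dtypes.
state_dict = load_file(checkpoint_path)
for name, tensor in list(state_dict.items())[:10]:
    print(name, tuple(tensor.shape), tensor.dtype)
```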