---
license: apache-2.0
---
Generated with https://github.com/yhyu13/AutoGPTQ.git, branch `cuda_dev`.

Original weights: https://huggingface.co/tiiuae/falcon-7b

Note: this is a quantization of the base model, which has not been fine-tuned with chat instructions.
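Below is a minimal loading sketch, assuming the `cuda_dev` fork keeps the upstream AutoGPTQ `from_quantized` API and that this quantized checkpoint is published under the repo id `yhyu13/falcon-7b-autogptq` (adjust the id to the actual path if it differs):

```python
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

# Assumed repo id for this quantized checkpoint; change if the actual path differs.
model_id = "yhyu13/falcon-7b-autogptq"

# Falcon ships custom modeling code, so trust_remote_code is required.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoGPTQForCausalLM.from_quantized(
    model_id,
    device="cuda:0",
    trust_remote_code=True,
)

# The base model is not instruction-tuned, so prompt it as a plain LM.
prompt = "The Falcon series of language models was"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since the base model has no chat fine-tuning, expect plain text continuation rather than instruction-following behavior.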