
This model is a 4-bit quantized version of the alpaca-7b model.

It was created with the https://github.com/oobabooga/GPTQ-for-LLaMa repository for better compatibility with text-generation-webui.

Load it with `--wbits 4` and `--groupsize 128`.
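For example, the flags can be passed on the command line when launching text-generation-webui from its repository root. This is a minimal sketch; the model directory name `alpaca-7b-4bit-128g` is a placeholder for whatever folder you place these weights in under `models/`:

```sh
# Minimal sketch: launch text-generation-webui with 4-bit GPTQ settings.
# The model directory name is a placeholder; use the folder you downloaded
# these weights into under text-generation-webui/models/.
python server.py --model alpaca-7b-4bit-128g --wbits 4 --groupsize 128
```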
