license: apache-2.0
GPTQ 4-bit quantization without act-order, for compatibility; this version works in text-generation-webui (see the loading sketch below).
Generated using the scripts from https://gitee.com/yhyu13/llama_-tools
Original weights: https://huggingface.co/project-baize/baize-v2-7b
Baize is a LoRA training framework that allows fine-tuning LLaMA models on commodity GPUs.
Check out my 7B Baize GPTQ 4-bit here: https://huggingface.co/Yhyu13/baize-v2-7b-gptq-4bit
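
For reference, here is a minimal loading sketch using AutoGPTQ instead of text-generation-webui. The repo id, safetensors layout, and generation settings below are assumptions and may need adjusting to this repository's actual files.

```python
# Minimal sketch: loading a GPTQ 4-bit LLaMA checkpoint with AutoGPTQ.
# Assumptions: the repo id and safetensors layout are illustrative; adjust
# model_id / use_safetensors to match this repository's actual files.
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

model_id = "Yhyu13/baize-v2-7b-gptq-4bit"  # assumed repo id (from the link above)

tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=True)
model = AutoGPTQForCausalLM.from_quantized(
    model_id,
    device="cuda:0",
    use_safetensors=True,  # assumption: quantized weights stored as .safetensors
    use_triton=False,      # plain CUDA kernels are fine for a no-act-order model
)

prompt = "Hello, how are you?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```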