Baize V2 7B 4-bit
From project-baize: https://huggingface.co/project-baize/baize-v2-7b
Folders:
ggml: q4_0 and q4_1 quantizations
gptq: works with both the Triton and CUDA branches