## GPT4All 13B snoozy 4-bit

From nomic-ai: https://huggingface.co/nomic-ai/gpt4all-13b-snoozy

### Folders

**ggml:** q4_0 and q4_1

**gptq:** works with the Triton and CUDA branches
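For reference, a minimal sketch of loading one of the ggml q4_0 files with llama-cpp-python (assuming an older build that still reads ggml rather than gguf files); the filename below is hypothetical, so substitute the actual file from the ggml folder.

```python
# Minimal sketch: load a ggml q4_0 file with llama-cpp-python and run a short prompt.
# NOTE: the model_path is a hypothetical example; use the real filename from the ggml folder.
from llama_cpp import Llama

llm = Llama(model_path="ggml/gpt4all-13b-snoozy.q4_0.bin")

output = llm(
    "### Instruction:\nName three primary colors.\n### Response:\n",
    max_tokens=64,
)
print(output["choices"][0]["text"])
```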