FP16 version?

#1 by Nexesenex - opened

Hey Brandon,

airoboros-33b-gpt4-1.4.1-NTK-16384-fp16 should now be fully usable with proper GGML quantization, and I'd be curious to convert it to GGUF to play with it and compare it against the LXCTX-PI version, which is still one of my daily-use models alongside Airoboros C34 2.1. However, the FP16 link gives a 404. Could you share it again, please?
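
For reference, this is roughly what I'd do once the FP16 weights are back up, a minimal sketch assuming llama.cpp's convert.py and quantize tool; the paths and the Q4_K_M quant type are only placeholders:

```python
import subprocess
from pathlib import Path

# Hypothetical local paths -- adjust to wherever the FP16 weights and llama.cpp live.
MODEL_DIR = Path("airoboros-33b-gpt4-1.4.1-NTK-16384-fp16")
LLAMA_CPP = Path("llama.cpp")
F16_GGUF = Path("airoboros-33b-NTK-16384-f16.gguf")
QUANT_GGUF = Path("airoboros-33b-NTK-16384-Q4_K_M.gguf")

# Step 1: convert the HF FP16 checkpoint to a GGUF file with llama.cpp's converter.
subprocess.run(
    ["python", str(LLAMA_CPP / "convert.py"), str(MODEL_DIR),
     "--outtype", "f16", "--outfile", str(F16_GGUF)],
    check=True,
)

# Step 2: quantize the FP16 GGUF (Q4_K_M picked here only as an example).
subprocess.run(
    [str(LLAMA_CPP / "quantize"), str(F16_GGUF), str(QUANT_GGUF), "Q4_K_M"],
    check=True,
)
```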
