https://huggingface.co/jondurbin/bagel-dpo-20b-v04
#13 opened by ayyylol
Would really appreciate the GGUF quants for this. Thank you!
https://huggingface.co/jondurbin/bagel-dpo-20b-v04
It's in the queue now :)
Thank you!
That's the first InternLM model I've quantized (I know because my pipeline didn't handle it :). Unfortunately, the model also overflowed during imatrix generation, so the imatrix quants might be of lower quality.
Anyway, the static quants are now at https://huggingface.co/mradermacher/bagel-dpo-20b-v04-GGUF, and the imatrix ones will be at https://huggingface.co/mradermacher/bagel-dpo-20b-v04-i1-GGUF within a day.
mradermacher changed discussion status to closed