https://huggingface.co/fblgit/UNA-ThePitbull-21.4-v1
Sounds interesting, would like this added to the queue
Absolutely, but it's not currently supported by llama.cpp, see https://huggingface.co/mradermacher/model_requests/discussions/69#6653418bb34bbdaec8fcc5cd
I could possibly hack the model to make it convert, but I would prefer to have an actual upstream repo with this workaround, as it's likely not an issue with the model itself.
I think bartowski did just that (he usually doesn't write down what he is doing), so you can use his quants, even if they are probably not correct.
Hmm, he did change quite a few tokens. I'll try to see what happens if only the U+0000 token is changed.
I misunderstood what he did, it seems he only changed the U+0000 token. Should be all good then.
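For anyone wanting to reproduce that kind of workaround locally, a minimal sketch of swapping out a literal U+0000 vocab entry in a `tokenizer.json` might look like this. Note the `<0x00>` replacement spelling and the exact file layout are assumptions on my part, not necessarily the change bartowski actually made:

```python
def patch_null_token(tok: dict, replacement: str = "<0x00>") -> dict:
    """Replace any tokenizer entry whose text is the literal U+0000 character.

    Assumes a HF fast-tokenizer layout: vocab under tok["model"]["vocab"]
    and special tokens under tok["added_tokens"].
    """
    vocab = tok.get("model", {}).get("vocab", {})
    for text in list(vocab):
        if text == "\x00":
            # Keep the same token id, just rename the surface form.
            vocab[replacement] = vocab.pop(text)
    for added in tok.get("added_tokens", []):
        if added.get("content") == "\x00":
            added["content"] = replacement
    return tok

# Usage (sketch): load tokenizer.json, patch, write it back, then retry
# the llama.cpp conversion:
#   import json
#   tok = json.load(open("tokenizer.json"))
#   json.dump(patch_null_token(tok), open("tokenizer.json", "w"))
```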