request
This model was already queued and failed in the past. I will check why it failed after work.
And there's something tricky that I don't understand. I was able to make one quant with GGUF-my-repo, but when I tried to load it in my CLI straight from HF, the GGUF download never finished. When I manually downloaded my GGUF, though, it loaded normally.
Here's the link for the only GGUF I converted; maybe it can shed some light:
https://huggingface.co/RezVortex/JAJUKA-WEWILLNEVERFORGETYOU-3B-Q4_K_S-GGUF
Oh, I see, I made a slightly different merge too. If you could quant it as well, I'd be very grateful.
It's this one:
https://huggingface.co/RezVortex/Jajuka-3b
Have a great day and a great week, and many cheers for your work.
https://huggingface.co/RezVortex/JAJUKA-WEWILLNEVERFORGETYOU-3B
OK, I forcefully queued it. Let's see if it works. I would assume not, but we will see.
You can check for progress at http://hf.tst.eu/status.html or regularly check the model
summary page at https://hf.tst.eu/model#JAJUKA-WEWILLNEVERFORGETYOU-3B-GGUF for quants to appear.
It's queued! :D
You can check for progress at http://hf.tst.eu/status.html or regularly check the model
summary page at https://hf.tst.eu/model#Jajuka-3b-GGUF for quants to appear.