GPTQ and GGUF conversion
#6, opened by nonetrix
It would be really nice if someone converted this model into other formats for GPU and efficient CPU-based inference, along with some instructions on how to use it inside text-generation-webui etc. Would be neat.
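For the GGUF side of this request, the usual route is llama.cpp's conversion and quantization tools. The sketch below is an assumption, not something from this thread: the script name (`convert_hf_to_gguf.py`), the `llama-quantize` binary, and the model path are illustrative and have changed across llama.cpp versions, so check the repo's current docs before running anything.

```shell
# Hypothetical GGUF conversion sketch using llama.cpp (paths are placeholders).
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
pip install -r requirements.txt

# Convert the Hugging Face checkpoint to a full-precision GGUF file.
python convert_hf_to_gguf.py /path/to/hf-model --outfile model-f16.gguf

# Build the tools, then quantize for efficient CPU inference.
# Q4_K_M is a common quality/size trade-off among llama.cpp quant types.
cmake -B build && cmake --build build --config Release
./build/bin/llama-quantize model-f16.gguf model-q4_k_m.gguf Q4_K_M
```

The resulting `.gguf` file can then be loaded by GGUF-aware frontends such as text-generation-webui; GPTQ conversion is a separate path with its own tooling.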
Hey, I am currently working with a few guys on just that. Will keep you updated. PS: you can reach out to me on my LinkedIn and I can keep you updated.
I don't personally use LinkedIn but thanks
SnypzZz changed discussion status to closed
Has this been resolved? Why was this closed? I can't find a GPTQ or GGUF version of this model on HF.
Any updates?