GGUF for CausalLM/35b-beta-long?

#1
by Elfrino - opened

@0-hero @munish0838

Hello guys,

I was just wondering if it's possible to get some GGUFs for this model:

https://huggingface.co/CausalLM/35b-beta-long

It appears to be one of the few fine-tuned Command-R 35b models, showing promising test results.

Thank you in advance.

Quant Factory org

Sorry about the late reply. As of now, Command R conversion doesn't seem to work after the llama.cpp BPE update. Will do this once a fix is released.

No worries.

I think they merged a fix yesterday. :)

https://github.com/ggerganov/llama.cpp/pull/7063

Quant Factory org

They will be up here in a few mins - QuantFactory/CausalLM-35b-beta-long-GGUF


Awesome, thank you! :)
