
No tokenizer available?

#10
by dspyrhsu - opened

Hi there, I am using ctransformers and I am creating the GGUF-model like this:

from ctransformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "TheBloke/zephyr-7B-beta-GGUF",
    hf=True,
)

After that, I would like to create the corresponding tokenizer like this:

tokenizer = AutoTokenizer.from_pretrained(model)

However, this raises a NotImplementedError. How can I specify a tokenizer for this model?
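[Editor's note: one commonly suggested workaround, given as an untested sketch. GGUF repositories typically ship only the quantized weight files and no tokenizer files, so instead of deriving the tokenizer from the ctransformers model object, it can be loaded with the standard `transformers` `AutoTokenizer` from the original model repository that the GGUF files were converted from (assumed here to be `HuggingFaceH4/zephyr-7b-beta`).]

```python
# Sketch of a workaround: load weights via ctransformers, but take the
# tokenizer from the original (non-GGUF) repo with plain transformers.
from ctransformers import AutoModelForCausalLM
from transformers import AutoTokenizer

# hf=True wraps the GGUF model in a transformers-compatible interface.
model = AutoModelForCausalLM.from_pretrained(
    "TheBloke/zephyr-7B-beta-GGUF",
    hf=True,
)

# The GGUF repo has no tokenizer files; the base repo it was converted
# from does, so load the tokenizer from there instead.
tokenizer = AutoTokenizer.from_pretrained("HuggingFaceH4/zephyr-7b-beta")
```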

Best regards and thanks for the great work!
