
Tokenizer.model

#6
by HoangHa - opened

I saw that TheBloke's best practice for converting a model to GGUF format is to have a tokenizer.model file, but I don't see one here. Its absence may cause problems, such as the GGUF model being unable to predict the end-of-sequence token. Could you add it here? Or how can I get it?

Thank you in advance.
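If a related repo (e.g. the base model) ships a matching tokenizer.model, one way to fetch it is via the hub's `resolve` download URL. A minimal sketch, assuming such a repo exists; the repo id below is a placeholder, not a confirmed source:

```python
# Sketch: download tokenizer.model from another Hugging Face repo
# that ships one. Replace the placeholder repo id with a repo whose
# tokenizer actually matches this model.
from urllib.request import urlretrieve

HF_RESOLVE = "https://huggingface.co/{repo}/resolve/{rev}/{file}"

def tokenizer_url(repo: str, rev: str = "main",
                  file: str = "tokenizer.model") -> str:
    """Build the direct-download URL the hub serves for a file in a repo."""
    return HF_RESOLVE.format(repo=repo, rev=rev, file=file)

if __name__ == "__main__":
    # Hypothetical repo id -- substitute the real base model before running.
    url = tokenizer_url("some-org/some-base-model")
    urlretrieve(url, "tokenizer.model")  # saves next to the other model files
```

Placing the downloaded file in the model directory lets conversion scripts that look for tokenizer.model pick it up.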
