Create config.json
#7
by
standingout
- opened
No description provided.
What's the benefit?
Hi. I am sorry for the confusion. I was trying to load the tokenizer and the model with Transformers:
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bartowski/Meta-Llama-3-8B-Instruct-GGUF"
filename = "Meta-Llama-3-8B-Instruct-Q8_0.gguf"
tokenizer = AutoTokenizer.from_pretrained(model_id, gguf_file=filename)
model = AutoModelForCausalLM.from_pretrained(model_id, gguf_file=filename)
but when I try to load the tokenizer, I get the following error: TypeError: expected str, bytes or os.PathLike object, not NoneType
Could you please upload the tokenizer config too?
This is a GGUF repo; you can't load it with Transformers. Either use the original safetensors repo with Transformers, or use these files with llama.cpp.
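For anyone landing here: a minimal sketch of the suggested route, using the llama-cpp-python bindings (which can pull a GGUF file straight from a Hub repo via `Llama.from_pretrained`; requires `llama-cpp-python` and `huggingface-hub` to be installed, and downloads the ~8 GB quantized model on first run):

```python
# Sketch only: assumes `pip install llama-cpp-python huggingface-hub`.
from llama_cpp import Llama

# Downloads the GGUF file from the Hub and loads it with llama.cpp.
llm = Llama.from_pretrained(
    repo_id="bartowski/Meta-Llama-3-8B-Instruct-GGUF",
    filename="Meta-Llama-3-8B-Instruct-Q8_0.gguf",
)

# The chat template is embedded in the GGUF metadata, so no separate
# tokenizer config is needed.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=64,
)
print(out["choices"][0]["message"]["content"])
```

This sidesteps the missing `tokenizer_config.json` entirely, since llama.cpp reads the tokenizer from the GGUF file itself.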
bartowski
changed pull request status to
closed