Not able to load the model

#2
by varun500 - opened

```
ValueError                                Traceback (most recent call last)
/tmp/ipykernel_27/1542641557.py in <module>
      1 from transformers import AutoTokenizer, AutoModelForCausalLM
      2
----> 3 tokenizer = AutoTokenizer.from_pretrained("4bit/alpaca-7b-native-4bit")
      4
      5 model = AutoModelForCausalLM.from_pretrained("4bit/alpaca-7b-native-4bit")

/opt/conda/lib/python3.7/site-packages/transformers/models/auto/tokenization_auto.py in from_pretrained(cls, pretrained_model_name_or_path, *inputs, **kwargs)
    675         if tokenizer_class is None:
    676             raise ValueError(
--> 677                 f"Tokenizer class {tokenizer_class_candidate} does not exist or is not currently imported."
    678             )
    679         return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)

ValueError: Tokenizer class LlamaTokenizer does not exist or is not currently imported.
```

I don't think this is my model being referenced here, but regardless, the error most likely means you have a very out-of-date version of transformers, from before the LLaMA PR was officially merged.
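A quick way to check whether your installed transformers build is new enough is to test for the `LlamaTokenizer` attribute directly (the `has_llama_tokenizer` helper below is just an illustrative sketch, not part of the library):

```python
import importlib.util


def has_llama_tokenizer() -> bool:
    """Return True if the installed transformers exposes LlamaTokenizer.

    Returns False if transformers is not installed at all.
    """
    if importlib.util.find_spec("transformers") is None:
        return False
    import transformers  # imported lazily so the check also works when absent

    return hasattr(transformers, "LlamaTokenizer")


if __name__ == "__main__":
    if has_llama_tokenizer():
        print("transformers is new enough: LlamaTokenizer is available")
    else:
        print("upgrade first, e.g.: pip install --upgrade transformers")
```

If the check fails, upgrading with `pip install --upgrade transformers` and restarting the kernel should make the `AutoTokenizer.from_pretrained` call above resolve the tokenizer class.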
