Tokenizer Error

#2
opened by Saptarshi7

When I try to load the tokenizer like this:

from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('axiong/PMC_LLaMA_13B')

I'm getting the following error:

RecursionError: maximum recursion depth exceeded while getting the str of an object

I also checked, and there is no tokenizer.json file in this model's repo. Is this an error?
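For what it's worth, one thing that may be worth trying is loading the slow, SentencePiece-based tokenizer directly instead of letting AutoTokenizer attempt the fast-tokenizer conversion. A minimal sketch, assuming the repo's tokenizer.model file is valid (I haven't verified this fixes the recursion for this particular repo):

from transformers import LlamaTokenizer

# Workaround sketch: bypass AutoTokenizer's class resolution and load the
# slow LLaMA tokenizer directly, since the repo ships no tokenizer.json.
tokenizer = LlamaTokenizer.from_pretrained('axiong/PMC_LLaMA_13B')
print(tokenizer.tokenize('Hello world'))

Passing use_fast=False to AutoTokenizer.from_pretrained should select the slow tokenizer as well, though I'm not certain it avoids the recursion here.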

@axiong any update? I'm encountering the same issue.
