I can't use the model, it's showing an error

#7
by Luciferalive - opened

Error: Couldn't instantiate the backend tokenizer from one of:
(1) a tokenizers library serialization file,
(2) a slow tokenizer instance to convert or
(3) an equivalent slow tokenizer class to instantiate and convert.
You need to have sentencepiece installed to convert a slow tokenizer to a fast one.

I have installed sentencepiece, but it's still showing the error.
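
For reference, this error usually means that sentencepiece is not visible to the Python environment that is actually running transformers. Below is a minimal sketch of how to check, assuming a pip-based setup and that the runtime is restarted after installing (the protobuf package is an assumption; it is sometimes also needed for the slow-to-fast conversion):

# install the conversion dependencies, then restart the runtime/kernel
# pip install sentencepiece protobuf

import sentencepiece  # raises ImportError if the package is missing from this environment
print(sentencepiece.__version__)

from transformers import AutoTokenizer

# loading the fast tokenizer triggers the slow-to-fast conversion that needs sentencepiece
tokenizer = AutoTokenizer.from_pretrained("cmarkea/distilcamembert-base-sentiment")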

Credit Mutuel Arkea org

Hello Luciferalive,

I am unable to reproduce your error. Could you please execute this code and provide a copy-paste of the error you encounter?

from transformers import AutoTokenizer

# load the tokenizer for the sentiment model
tokenizer = AutoTokenizer.from_pretrained("cmarkea/distilcamembert-base-sentiment")
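
If this snippet works on one machine but not on yours, the difference is most likely the Python environment. A quick check (a sketch, assuming pip installed sentencepiece for a different interpreter than the one running your code):

import sys
print(sys.executable)  # the interpreter actually running the code; pip must install into this same environment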
Cyrile changed discussion status to closed
