ValueError: This tokenizer cannot be instantiated. Please make sure you have `sentencepiece` installed in order to use this tokenizer.

#17
by HeMingYang - opened

(Screenshots of the error attached: 截屏2023-10-22 17.29.14.png, 截屏2023-10-22 17.27.02.png)
ValueError: This tokenizer cannot be instantiated. Please make sure you have sentencepiece installed in order to use this tokenizer.

from transformers import pipeline, AutoTokenizer, AutoModelForSeq2SeqLM

# To use this model, sentencepiece must be installed
# !pip install sentencepiece

tokenizer = AutoTokenizer.from_pretrained("KennStack01/Helsinki-NLP-opus-mt-zh-en")
model = AutoModelForSeq2SeqLM.from_pretrained("KennStack01/Helsinki-NLP-opus-mt-zh-en")


translator = pipeline(task="translation_zh_to_en",
                      model=model,
                      tokenizer=tokenizer)

sentence = "我叫萨拉,我住在伦敦。"

translator(sentence, max_length=20)

Hello! After installing sentencepiece, please make sure that you have restarted the Google Colab runtime. Thanks!
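For reference, a minimal sketch of that workflow in a Colab cell, assuming the notebook above. The os.kill call is just one common way to force a restart from code; using Runtime → Restart runtime from the menu works equally well:

# Install the missing dependency, then restart the runtime so the
# already-imported transformers code picks it up cleanly on re-import.
!pip install sentencepiece

import os
os.kill(os.getpid(), 9)  # kills the kernel process; Colab restarts it automatically

# After the restart, re-run the tokenizer/model/pipeline cells above.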
