Problems with the tokenizer because sentencepiece is legacy

#3
by llyterson - opened

I am trying to run the example code, but I am running into problems with the tokenizer.
The specific error is:

ValueError: This tokenizer cannot be instantiated. Please make sure you have sentencepiece installed in order to use this tokenizer.

The problem is that when I try to install sentencepiece, I see that it is marked as legacy.

Have you run into this problem? Any suggestions?

Try using Python 3.9 instead, @llyterson.
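
Something like this worked for me in a fresh Python 3.9 environment. This is only a minimal sketch: t5-small is used here purely as an example of a model whose tokenizer depends on sentencepiece, so swap in the model ID from the example code you are actually running.

```python
# Assumes the environment has: pip install transformers sentencepiece
from transformers import AutoTokenizer

# "t5-small" is a stand-in for a sentencepiece-based tokenizer;
# replace it with the model ID from the example code.
tokenizer = AutoTokenizer.from_pretrained("t5-small")

# Quick check that the tokenizer loads and encodes text
print(tokenizer("Hello world"))
```

If sentencepiece is missing, loading the tokenizer is exactly where the ValueError above is raised, so a clean reinstall of both packages in the new environment should be enough.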
