Spaces:
MyNLPApp
Runtime error

Space not ready. Reason: Error, exitCode: 1, message: None

Container logs:

Downloading: 100%|██████████| 62.0/62.0 [00:00<00:00, 76.1kB/s]
Downloading: 100%|██████████| 836/836 [00:00<00:00, 1.19MB/s]
Downloading: 100%|██████████| 1.80M/1.80M [00:00<00:00, 18.2MB/s]
Downloading: 100%|██████████| 156/156 [00:00<00:00, 194kB/s]
Traceback (most recent call last):
  File "app.py", line 4, in <module>
    tokenizer = AutoTokenizer.from_pretrained("m3hrdadfi/albert-fa-base-v2-sentiment-snappfood")
  File "/home/user/.local/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 573, in from_pretrained
    return tokenizer_class_fast.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/home/user/.local/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1784, in from_pretrained
    return cls._from_pretrained(
  File "/home/user/.local/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1929, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/home/user/.local/lib/python3.8/site-packages/transformers/models/albert/tokenization_albert_fast.py", line 148, in __init__
    super().__init__(
  File "/home/user/.local/lib/python3.8/site-packages/transformers/tokenization_utils_fast.py", line 118, in __init__
    raise ValueError(
ValueError: Couldn't instantiate the backend tokenizer from one of: 
(1) a `tokenizers` library serialization file, 
(2) a slow tokenizer instance to convert or 
(3) an equivalent slow tokenizer class to instantiate and convert. 
You need to have sentencepiece installed to convert a slow tokenizer to a fast one.
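The last line of the traceback names the cause: ALBERT's tokenizer is SentencePiece-based, and `transformers` cannot build (or convert to) the fast tokenizer without the `sentencepiece` package. On Spaces the usual fix is to add a `sentencepiece` line to the repo's `requirements.txt` and restart; the exact layout of this Space is an assumption here. A minimal pre-flight check that could go at the top of `app.py` to fail with a clear message instead of the opaque `ValueError` (a sketch, not the Space's actual code):

```python
import importlib.util

def sentencepiece_installed() -> bool:
    # transformers needs the `sentencepiece` package to instantiate
    # ALBERT's slow tokenizer or convert it to a fast one.
    return importlib.util.find_spec("sentencepiece") is not None

# Fail early with an actionable message before calling AutoTokenizer:
if not sentencepiece_installed():
    print("Missing dependency: add `sentencepiece` to requirements.txt "
          "(or run `pip install sentencepiece`) and restart the Space.")
```

Note that passing `use_fast=False` to `AutoTokenizer.from_pretrained` would not help here: the slow `AlbertTokenizer` depends on `sentencepiece` as well, so installing the package is the only path.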