model doesn't work in the hosted inference API
#1 opened by labx
This is the error returned when trying to use the model via the API:
Can't load tokenizer using from_pretrained, please update its configuration: Tokenizer class NllbTokenizer does not exist or is not currently imported.
Hey @labx , NLLB models require the NllbTokenizer, as you have seen.
These are being contributed in this PR: https://github.com/huggingface/transformers/pull/18126
Please check it out and let us know in the PR comments if it works as expected for you!
Thanks.
@lysandre He is describing an error with the model's hosted inference API, not with his own code.
I don't know whether that is still the underlying error, because the API now returns this error instead when I run it:
"Model facebook/nllb-200-3.3B time out"
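A timeout like this often just means a large model (nllb-200-3.3B is several gigabytes) is still being loaded on the inference backend. A common client-side workaround is to retry the request with exponential backoff until the model is warm. This is a minimal sketch of that pattern, with a stand-in `fake_query` function (hypothetical, for illustration) in place of the real API call:

```python
import time

def query_with_retry(query, max_retries=5, base_delay=2.0):
    """Retry a flaky inference call with exponential backoff.

    `query` is any zero-argument callable that raises TimeoutError
    while the model is still loading and returns a result once ready.
    """
    for attempt in range(max_retries):
        try:
            return query()
        except TimeoutError:
            if attempt == max_retries - 1:
                raise  # give up after the last attempt
            time.sleep(base_delay * 2 ** attempt)

# Illustration: a stand-in for the API call that fails twice, then succeeds.
calls = {"n": 0}
def fake_query():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("Model facebook/nllb-200-3.3B time out")
    return {"translation_text": "ok"}

result = query_with_retry(fake_query, base_delay=0.01)
```

If I remember correctly, the hosted Inference API also accepts a `wait_for_model` option in the request payload that asks the server to hold the request until the model is loaded, which avoids the retry loop entirely; check the API docs for the exact parameter.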