create OPTTokenizer without AutoTokenizer.from_pretrained

#6
by dhansmair - opened

Hello there!
I'm not sure if this is the right place to ask. On the machine I am working on, I can't use AutoTokenizer.from_pretrained() because it raises an exception: "error: no locks available". This is a local problem on my side, but I am wondering whether there is an alternative way to instantiate the tokenizer, such as a dedicated OPTTokenizer class. For example, GPT-2 has a GPT2Tokenizer class, which works fine on this machine.
Thanks a lot for your help and for providing this awesome LM!
Best, David

I figured it out: it's as simple as using GPT2Tokenizer.from_pretrained("facebook/opt-30b").
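For reference, a minimal sketch of this workaround (assuming transformers is installed and the checkpoint's tokenizer files are reachable or already cached locally):

```python
from transformers import GPT2Tokenizer

# OPT reuses GPT-2's byte-level BPE vocabulary, so GPT2Tokenizer can load
# the OPT checkpoint's tokenizer files directly, bypassing AutoTokenizer.
tokenizer = GPT2Tokenizer.from_pretrained("facebook/opt-30b")

ids = tokenizer("Hello there!").input_ids
print(ids)                    # the encoded token ids
print(tokenizer.decode(ids))  # round-trips back to the original text
```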

Hey @dhansmair, indeed, the OPT checkpoints use the GPT2Tokenizer, so loading it as in your last comment should work!

If you'd like help regarding the first issue (error: no locks available), please do open an issue on the transformers repository and we'll be happy to investigate with you.

lysandre changed discussion status to closed
