The ESMForMaskedLM module cannot be imported from transformers

#1
by ThisIsSeaton - opened

Hi, my transformers package version is 4.23.1, and when I try to import ESMForMaskedLM and ESMTokenizer, it fails.
There seem to be EsmForMaskedLM and EsmTokenizer classes in the transformers package, but I don't know whether they are the right ones for the ESM-1b model.
Looking forward to your reply, thank you very much.
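
For reference, here is a minimal sketch of what I'm running (the class names in the last import are just my guess at the right ones):

```python
# transformers 4.23.1 -- minimal reproduction of the import problem.

# This is what the examples I found suggest, but it raises ImportError for me:
# from transformers import ESMForMaskedLM, ESMTokenizer

# These names seem to exist in the package, but I'm not sure they are meant for ESM-1b:
from transformers import EsmForMaskedLM, EsmTokenizer
```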

AI at Meta org

Hi @ThisIsSeaton ! I'm currently merging a large ESM PR and reuploading/moving some model checkpoints - we will have proper checkpoints for ESM-1b and ESM-2 very soon. Hopefully everything will work then!

Hi @ThisIsSeaton ! Sorry for the delay - we have proper checkpoints for ESM-1b, ESM-1v and ESM-2 now. ESM-1b now lives at facebook/esm1b_t33_650M_UR50S - your code should work if you use that checkpoint. If you'd rather use ESM-2 instead, you can also use any of the facebook/ checkpoints here: https://huggingface.co/models?sort=downloads&search=esm
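
For example, something along these lines should work once you point at that checkpoint (a minimal sketch using the standard masked-LM API; the example protein sequence is just a placeholder):

```python
import torch
from transformers import AutoTokenizer, EsmForMaskedLM

# Load the re-uploaded ESM-1b checkpoint
tokenizer = AutoTokenizer.from_pretrained("facebook/esm1b_t33_650M_UR50S")
model = EsmForMaskedLM.from_pretrained("facebook/esm1b_t33_650M_UR50S")

# Arbitrary example sequence with one masked residue
sequence = "MKTAYIAKQR<mask>ISFVKSHFSRQLEERLGLIEVQ"
inputs = tokenizer(sequence, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Predicted amino acid at the masked position
mask_index = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```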

Sadly, these checkpoints still don't work.
For example, https://huggingface.co/facebook/esm2_t12_35M_UR50D/blob/main/tokenizer_config.json sets "tokenizer_class": "EsmTokenizer",
but it is still impossible to even import EsmTokenizer.
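
This is roughly what I'm trying (the class name is taken from the tokenizer_config.json above):

```python
# transformers 4.23.1
from transformers import AutoTokenizer

# AutoTokenizer needs to resolve the "EsmTokenizer" class named in
# tokenizer_config.json, which my install can't import.
tokenizer = AutoTokenizer.from_pretrained("facebook/esm2_t12_35M_UR50D")
```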

It seems the reason is that your PR has not been merged (https://github.com/huggingface/transformers/pull/13662),
so we can't use classes like ESMForMaskedLM and ESMTokenizer.
Even the Hosted Inference API on your model page (https://huggingface.co/facebook/esm-1b?text=The+goal+of+life+is+%3Cmask%3E) doesn't work right now.

So can you solve this ASAP?
As you can see, I'm quite in a hurry XD.
