Error - Checkout your internet connection

#2
by vladak - opened

I created my API access token, and when I try to run:

```python
# Load model directly
from transformers import AutoModel
model = AutoModel.from_pretrained("EvolutionaryScale/esm3-sm-open-v1", token=access_token)
```

I get this error:

```
We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like EvolutionaryScale/esm3-sm-open-v1 is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
```

Any suggestions?

How can I deploy this model in SageMaker as an endpoint and do inference? Do we have any examples of the payload?

I don't think AutoModel is going to pick up the weights because of where the files are located in the repo. Can you run their example code (https://github.com/evolutionaryscale/esm/blob/main/examples/generate.ipynb) locally or on Google Colab (removing the CUDA references if you don't have a local GPU)? The same folder also has other examples that may answer your inference question.

Yes, it works locally. Now I want to expose this as an endpoint. Do we have some sample code showing what the payload looks like?
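I haven't seen an official SageMaker payload example for this model either. As a starting point, here is a minimal sketch of calling a SageMaker runtime endpoint with boto3. Note the endpoint name (`esm3-endpoint`) and the JSON schema (`{"inputs": ..., "parameters": ...}`) are assumptions, not anything EvolutionaryScale publishes; the payload has to match whatever your custom `inference.py` handler parses.

```python
import json

# Hypothetical endpoint name and request schema -- adjust both to match
# the inference handler you deploy alongside the model.
ENDPOINT_NAME = "esm3-endpoint"

def build_payload(sequence: str) -> str:
    """Serialize a protein sequence into the JSON body the endpoint expects."""
    return json.dumps({"inputs": sequence, "parameters": {"num_steps": 8}})

def invoke(sequence: str) -> dict:
    """Send the payload to the SageMaker runtime endpoint and decode the reply."""
    import boto3  # imported here so build_payload works without boto3 installed
    runtime = boto3.client("sagemaker-runtime")
    response = runtime.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Body=build_payload(sequence),
    )
    return json.loads(response["Body"].read())
```

The shape of the response dict likewise depends entirely on what your handler returns.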

The issue was that I created the authorization token with custom privileges instead of "read". When I set the token scope to "read", the error was gone.
Yes, it should be fetched with:

```python
ESM3.from_pretrained("esm3_sm_open_v1")
```
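For completeness, here is a sketch of a full generation call with the `esm` package, along the lines of the repo's generate example; the masked sequence and `num_steps` value are just illustrative, and the whole thing assumes `pip install esm` plus a "read"-scoped Hugging Face token:

```python
def generate_completion(masked_sequence: str, num_steps: int = 8) -> str:
    """Fill masked ("_") positions in a protein sequence with ESM3.

    Downloads the esm3_sm_open_v1 weights from Hugging Face on the first
    call, so a "read"-scoped token must already be configured (e.g. via
    `huggingface-cli login`).
    """
    # Imports are local so the sketch can be defined without esm installed.
    from esm.models.esm3 import ESM3
    from esm.sdk.api import ESMProtein, GenerationConfig

    model = ESM3.from_pretrained("esm3_sm_open_v1")  # add .to("cuda") with a GPU
    protein = ESMProtein(sequence=masked_sequence)
    protein = model.generate(
        protein, GenerationConfig(track="sequence", num_steps=num_steps)
    )
    return protein.sequence
```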
vladak changed discussion status to closed
