Cannot download the model locally

#1
by Dipto084 - opened

Hello,

I am facing an issue downloading and using the model. Here is the code snippet I am using:

from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained('castorini/repllama-v1-mrl-7b-lora-passage', token=True)
model = AutoModel.from_pretrained('castorini/repllama-v1-mrl-7b-lora-passage').to('cuda')

and I get this error:
huggingface_hub.errors.LocalTokenNotFoundError: Token is required (token=True), but no token found. You need to provide a token or be logged in to Hugging Face with huggingface-cli login or huggingface_hub.login. See https://huggingface.co/settings/tokens.

Any suggestions on how to solve this?

Castorini org

Hi

I think token=True is not necessary here.
Alternatively, you may want to log in to Hugging Face with huggingface-cli login.
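For reference, a minimal sketch of both fixes, assuming the transformers library is installed, a CUDA device is available, and the repo is publicly downloadable (the model name is copied from the original snippet):

```python
from transformers import AutoTokenizer, AutoModel

# Option 1: drop token=True. If the repo is public, no token is needed,
# and from_pretrained will not look for a local credential.
tokenizer = AutoTokenizer.from_pretrained('castorini/repllama-v1-mrl-7b-lora-passage')
model = AutoModel.from_pretrained('castorini/repllama-v1-mrl-7b-lora-passage').to('cuda')

# Option 2: authenticate first, then token=True works. Either run
# `huggingface-cli login` in a shell, or log in from Python:
#
#   from huggingface_hub import login
#   login()  # prompts for a token from https://huggingface.co/settings/tokens
#
# After logging in, the original snippet with token=True should no longer
# raise LocalTokenNotFoundError.
```

Note that token=True only tells the library to use a locally stored token; it does not supply one, which is why the error appears when no login has been performed.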

Thanks for the response. I tried huggingface-cli login, but it still doesn't seem to work.
