Access to the model is restricted, even though I have been approved?

#132
by Kelmeilia - opened

I am sorry, surely there is a reasonable explanation for this, but I just don't get it ...

I have been granted access to the model:

[screenshot: the model page showing that access has been granted]

But unfortunately I still can't access it through the example script:

(with hf_token passed in the pipeline construction)

OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Meta-Llama-3-8B.
401 Client Error. (Request ID: Root=1-663fb5bd-7d28f4447bfd337e03d1d7e1;9595c68e-90d4-4f08-bbd9-096df0c5ddcb)

Cannot access gated repo for url https://huggingface.co/meta-llama/Meta-Llama-3-8B/resolve/main/config.json.
Access to model meta-llama/Meta-Llama-3-8B is restricted. You must be authenticated to access it.

Without the hf_token in transformers.pipeline:

What am I missing?

Edit: I am providing my HF token in the pipeline creation:

import torch
import transformers

model_id = "meta-llama/Meta-Llama-3-8B"

# hf_token holds my Hugging Face access token
pipeline = transformers.pipeline(
    "text-generation", model=model_id, model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto", use_auth_token=hf_token
)

Still the same error :(
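One way to narrow this down is to check, outside of transformers, whether the token itself can reach the gated repo. A minimal sketch using huggingface_hub (assuming hf_token is the same variable passed to the pipeline above):

from huggingface_hub import whoami, hf_hub_download

# Check which account the token belongs to (the access grant is per-account).
print(whoami(token=hf_token))

# Try to fetch the gated config.json directly with the same token;
# if this also fails with a 401 / gated-repo error, the token itself is the problem.
hf_hub_download(
    repo_id="meta-llama/Meta-Llama-3-8B",
    filename="config.json",
    token=hf_token,
)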

Same issue here

Yes, same issue.

I have solved the issue. I logged in to Hugging Face in the VS Code terminal and passed the parameter "use_auth_token=True" to the model and tokenizer, and then it succeeded.
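For reference, a minimal sketch of that approach (the from_pretrained calls below are assumed, not copied from the original setup): after running huggingface-cli login in the terminal, use_auth_token=True tells transformers to reuse the stored token:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B"

# use_auth_token=True reuses the token stored by `huggingface-cli login`
tokenizer = AutoTokenizer.from_pretrained(model_id, use_auth_token=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto", use_auth_token=True
)

In recent transformers releases, use_auth_token has been deprecated in favour of token, which can be passed the same way.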

Thanks for the help!

It seems that I needed to do one more step;
I had not accessed gated models before, so setting the HF_HUB_TOKEN environment variable and the aforementioned use_auth_token=True still wasn't enough. I also needed to run

huggingface-cli login

and provide the newly generated API key there as well...
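The same login can also be done from Python instead of the terminal; a small sketch using huggingface_hub (the token string is a placeholder):

from huggingface_hub import login

# Stores the token locally, just like `huggingface-cli login`;
# subsequent from_pretrained / pipeline calls can then pick it up automatically.
login(token="hf_xxx")  # placeholder: paste the access token generated on the Hub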

Kelmeilia changed discussion status to closed
