Access to model is restricted, even though I have been approved?
I am sorry, surely there is a reasonable explanation for this, but I just don't get it ...
I have been granted access to the model:
But unfortunately I still can't access it through the example script:
(with hf_token in pipeline construction)
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Meta-Llama-3-8B.
401 Client Error. (Request ID: Root=1-663fb5bd-7d28f4447bfd337e03d1d7e1;9595c68e-90d4-4f08-bbd9-096df0c5ddcb)Cannot access gated repo for url https://huggingface.co/meta-llama/Meta-Llama-3-8B/resolve/main/config.json.
Access to model meta-llama/Meta-Llama-3-8B is restricted. You must be authenticated to access it.

And without the hf_token in transformers.pipeline:
What am I missing?
Edit: I am providing my HF token in the pipeline creation:
import torch
import transformers

model_id = "meta-llama/Meta-Llama-3-8B"
pipeline = transformers.pipeline(
    "text-generation", model=model_id, model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto", use_auth_token=hf_token,  # newer transformers prefer `token=`
)
the same :(
Same issue here
Yes, same issue.
I have solved the issue: I logged in to Hugging Face in the VS Code terminal and passed the parameter use_auth_token=True to the model and tokenizer, and then it succeeded.
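For reference, a minimal sketch of that approach (assuming you have already run huggingface-cli login; note that newer transformers versions deprecate use_auth_token in favor of token):

from transformers import AutoTokenizer

# after `huggingface-cli login`, the cached token is picked up via use_auth_token=True
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-8B", use_auth_token=True)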
Thanks for the help!
It seems that I needed to do one more step: I had not accessed gated models before, so setting the HF_HUB_TOKEN environment variable and the aforementioned use_auth_token=True still wasn't enough. I also needed to run

huggingface-cli login

and provide the newly generated API key there as well.
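For anyone who prefers staying in Python, the same login can be done programmatically with huggingface_hub (a sketch; the "hf_..." string is a placeholder for your own token):

from huggingface_hub import login

login(token="hf_...")  # equivalent to `huggingface-cli login`; caches the token locally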
> I have solved the issue: I logged in to Hugging Face in the VS Code terminal and passed the parameter use_auth_token=True to the model and tokenizer, and then it succeeded.
Can you explain how?
Add token=your_token to your AutoModel instantiation.
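For example, a minimal sketch (your_token is a placeholder for your Read access token):

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B"
tokenizer = AutoTokenizer.from_pretrained(model_id, token=your_token)
model = AutoModelForCausalLM.from_pretrained(model_id, token=your_token)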
I had similar issues to the ones above. What resolved it for me was a comment from here: https://github.com/huggingface/diffusers/issues/6223. I created a Read access token and then it started working.
When I generated my token, I created the wrong type. Create a new access token of type Read instead of the default Fine-grained (custom).
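You can sanity-check the new token before wiring it into transformers (a sketch using huggingface_hub; hf_token stands for the Read token you just created):

from huggingface_hub import whoami

print(whoami(token=hf_token))  # prints your account info, or raises if the token is invalid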
This one worked for me:

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct", token=HF_TOKEN)
In mine it says this:
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Meta-Llama-3.1-8B.
403 Client Error. (Request ID: Root=1-66bde90e-307a3c0419da184f621cea2a;56d907ca-6175-4ec2-88c8-0ec3a17003d0)
Cannot access gated repo for url https://huggingface.co/meta-llama/Meta-Llama-3.1-8B/resolve/main/config.json.
Your request to access model meta-llama/Meta-Llama-3.1-8B is awaiting a review from the repo authors.
Very sad, I can't access this.
Mine is having the same issue; I have set use_auth_token=token, but I get:
[9fcfbfddrl] Cannot access gated repo for url https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct/resolve/b2a4d0f33b41fcd59a6d31662cc63b8d53367e1e/config.json.
[9fcfbfddrl] Access to model meta-llama/Meta-Llama-3.1-8B-Instruct is restricted. You must be authenticated to access it.
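If anyone else lands here, a quick way to tell whether the problem is the token itself or a still-pending access request (a sketch using huggingface_hub; hf_token is a placeholder for your Read token):

from huggingface_hub import model_info
from huggingface_hub.utils import GatedRepoError

try:
    model_info("meta-llama/Meta-Llama-3.1-8B-Instruct", token=hf_token)
    print("Token has access; the gate is passed.")
except GatedRepoError as err:
    # 401 -> missing/invalid token; 403 -> access request still awaiting review
    print(f"Still gated: {err}")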