Authorization header is correct, but the token seems invalid

#67
by Srisowmya - opened

raise ValueError(f"Error raised by inference API: {response['error']}")
ValueError: Error raised by inference API: Authorization header is correct, but the token seems invalid

I am getting this error. How do I resolve it?

Hi @Srisowmya,
Can you give more details on how you are running the model?

Hi, I am running the model in Python and connected it to Node.js using a spawned child process. When I run the Python file on its own, the output is displayed, but when I run it through the Node.js file I encounter this issue.

Are you correctly passing your HF token? A script like this should work, I think:

import os
import requests

# Use your personal token here (read it from the environment rather
# than hard-coding it)
TOKEN = os.environ.get("API_TOKEN")

API_URL = "https://api-inference.huggingface.co/models/google/flan-t5-xxl"
headers = {"Authorization": f"Bearer {TOKEN}"}

def query(payload):
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

output = query({
    "inputs": "The answer to the universe is",
})
print(output)
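For completeness, the traceback at the top of this thread comes from checking the JSON body for an "error" field. A minimal sketch of that check (the `check` helper name is mine, not from any library):

```python
def check(result):
    # The Inference API reports failures as a JSON body with an "error"
    # key; surface it as an exception instead of returning it silently.
    if isinstance(result, dict) and "error" in result:
        raise ValueError(f"Error raised by inference API: {result['error']}")
    return result
```

With a bad token, the raised message is exactly the one from this thread: "Error raised by inference API: Authorization header is correct, but the token seems invalid".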

Yeah, I am passing it correctly. I am passing the inference endpoint and the headers as well.

Can you try with a new HF token? Perhaps your token has expired for some reason.

@Srisowmya Have you found the reason for this? I get the same error message and have no explanation why. I have recreated and tested several tokens. Thanks!

Hello @Srisowmya, I actually had a similar problem. I tried some tests in Postman and the same error occurred. What solved it for me was changing the token's permissions to Reader. I was trying the requests using the third option.

I created a new token and changed it to Reader, but got the same error.

Hi, did anyone solve this issue?

Google org

@sanju9 @Ammar-Azman can you make sure your tokens are valid?
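One way to check a token independently of any model call is the Hub's whoami endpoint. A sketch assuming `requests` is installed; the helper names here are mine:

```python
import requests

def auth_headers(token):
    # An empty or missing token would still produce a well-formed header,
    # which is why the API says the header is "correct" but the token invalid.
    if not token:
        raise ValueError("HF token is missing")
    return {"Authorization": f"Bearer {token}"}

def check_token(token):
    # Returns the HTTP status: 200 means the token is valid;
    # 401 reproduces the error discussed in this thread.
    resp = requests.get("https://huggingface.co/api/whoami-v2",
                        headers=auth_headers(token))
    return resp.status_code
```

For example, `check_token(os.environ["API_TOKEN"])` lets you separate "token is bad" from "my inference code is bad".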

Yes, the token is valid, and I also tried with a new token.

The error is like:
BadRequestError: (Request ID: ISYlQUYmzovFGpb4-yDln)

Bad request:
Authorization header is correct, but the token seems invalid

I fixed the issue, though I'm not sure if my problem was the same as yours: my API_TOKEN was not being passed into the headers correctly.
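This failure mode is easy to reproduce: when the environment variable is unset (for example because a parent Node.js process didn't forward its environment to the spawned Python child), the header silently becomes the literal string "Bearer None". A sketch, using a deliberately unset variable name:

```python
import os

# "SOME_UNSET_VARIABLE" stands in for a token variable the parent
# process never exported; os.environ.get returns None for it.
token = os.environ.get("SOME_UNSET_VARIABLE")
headers = {"Authorization": f"Bearer {token}"}

# The request itself is well-formed, so the API answers with
# "Authorization header is correct, but the token seems invalid".
print(headers)  # {'Authorization': 'Bearer None'}
```

Logging the headers (with the token redacted) before sending the request makes this kind of bug visible immediately.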


I had exactly the same issue. Has anyone solved the problem?

Hello everyone,
I also had the same error. Just now, I fixed it by creating a new token with 'Read' permissions. Try doing that to see if it works for you too.

@Srisowmya, in case your token is fine-grained, you should check the "User Permissions" section under Settings > Access Tokens and tick the checkbox(es) under "Inference": either 1. "Make calls to the serverless Inference API" or 2. "Make calls to Inference Endpoints", whichever is applicable.
For fine-grained tokens you need to grant permissions for each action/resource you want to use with that token.
(Screenshot: the "User Permissions" > "Inference" section of the Access Token settings.)

I'm still facing the same issue even after trying every solution you provided.


P.S. "embeddings" works perfectly fine

Same issue here. I don't think I'm doing anything wrong, since the same code worked two months ago. Does anyone have anything new?

> Hello everyone,
> I also had the same error. Just now, I fixed it by creating a new token with 'Read' permissions. Try doing that to see if it works for you too.

I used your method and it works.

What worked for me was changing the variable name from HF_token to HF in the .env file. Using HF_token was causing an error, but switching to HF resolved the issue. I'm not sure why this happens, but it does.

thank you so much @killingspree , that really means a lot.❤️

Hey, I found a solution that works for me. I am using JavaScript for learning.
Here's how it worked for me:

(Screenshot of the JavaScript snippet.)

As shown in the screenshot, the key thing that solved the problem for me was to explicitly define the model to be used, since the API was failing to choose a model automatically.
Hope this works.

from langchain_community.llms import HuggingFaceHub

llm = HuggingFaceHub(
    repo_id="microsoft/Phi-3-mini-4k-instruct",
    task="text-generation",
    model_kwargs={
        "max_new_tokens": 512,
        "top_k": 30,
        "temperature": 0.1,
        "repetition_penalty": 1.03,
    },
)

llm.invoke("Hi")

I'm getting this error:

BadRequestError: (Request ID: uqw7hPzO2GIVKtgv5FY5Z)

Bad request:
Authorization header is correct, but the token seems invalid
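One thing worth checking with the LangChain snippet above: the `HuggingFaceHub` wrapper reads the token from the `HUGGINGFACEHUB_API_TOKEN` environment variable unless you pass `huggingfacehub_api_token=` explicitly, so a token exported under another name (HF_TOKEN, API_TOKEN, ...) is never picked up. A quick sanity-check sketch (the helper name is mine):

```python
import os

# HuggingFaceHub looks for this exact variable name when no token is
# passed explicitly via huggingfacehub_api_token=...
EXPECTED = "HUGGINGFACEHUB_API_TOKEN"

def token_env_hint(environ):
    # Reports whether the expected variable is present in the given
    # environment mapping.
    if EXPECTED in environ:
        return f"{EXPECTED} is set"
    return (f"{EXPECTED} is not set -- export it under this exact name "
            "or pass huggingfacehub_api_token explicitly")

print(token_env_hint(os.environ))
```

Alternatively, pass the token directly in the constructor: `HuggingFaceHub(repo_id=..., huggingfacehub_api_token="hf_...")`.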

@Yadandlakumar, as far as the token is concerned, did you try the steps mentioned by @ashique-desai? I was facing the same problem with my token, and following those steps solved it for me.
