Inference with the API and Integrating the LM (Language Model)

#1
by ridhoalattqas - opened

When I run inference on this model through the API with this code:

import json

import requests

headers = {"Authorization": f"Bearer {API_TOKEN}"}  # API_TOKEN is my Hugging Face access token
API_URL = "https://api-inference.huggingface.co/models/ridhoalattqas/xlrs-best-lm"

def query(audio_bytes):
    response = requests.post(API_URL, headers=headers, data=audio_bytes)
    return json.loads(response.content.decode("utf-8"))

is the language model already connected, sir?

I am sorry, but how does your question relate to this model?

Yehor changed discussion status to closed

I want to use the model through the API inference without saving it to my local drive, but when I try it, the language model (the KenLM .arpa file) is not linked to the inference results.
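For reference, when this checkpoint is loaded locally with transformers, a language model shipped in the repo is picked up automatically by the ASR pipeline (via Wav2Vec2ProcessorWithLM), provided pyctcdecode and kenlm are installed. A minimal sketch, assuming the repo contains a language_model/ folder; "sample.wav" is a hypothetical file, and the download/inference lines are commented out so the sketch stays self-contained. This illustrates local LM-boosted decoding only and does not change how the hosted Inference API decodes:

```python
from transformers import pipeline

MODEL_ID = "ridhoalattqas/xlrs-best-lm"

# The ASR pipeline applies the KenLM decoder automatically when the repo
# contains a language_model/ folder and pyctcdecode + kenlm are installed.
# Uncomment to run (downloads the checkpoint):
# asr = pipeline("automatic-speech-recognition", model=MODEL_ID)
# print(asr("sample.wav")["text"])  # "sample.wav" is a hypothetical local file
```

Comparing the local pipeline output against the API output for the same audio is one way to see whether the hosted endpoint applied the LM.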

ridhoalattqas changed discussion status to open

This question is for the Hugging Face team, not for me.

Yeah, I think you know how to do it, but thanks for responding to my question.

ridhoalattqas changed discussion status to closed
