#106 opened by tasmay
The Inference API for this model is not working; it returns a "Model not loaded yet" error. Please resolve this.
Hey ..., try passing the token into AutoTokenizer as well, not only into AutoModelForCausalLM:
from transformers import AutoModelForCausalLM, AutoTokenizer

# Pass the access token to both the tokenizer and the model
tokenizer = AutoTokenizer.from_pretrained(model_id, token="<your token>")
model = AutoModelForCausalLM.from_pretrained(model_id, token="<your token>")