ValueError: Could not load model HuggingFaceH4/starchat-beta with any of the following classes

#5
by hantianwei - opened

Traceback (most recent call last):
  File "/Users/un/Downloads/starcoder-main/main.py", line 4, in <module>
    pipe = pipeline("text-generation", model="HuggingFaceH4/starchat-beta", torch_dtype=torch.bfloat16, device_map="auto")
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/un/Downloads/starcoder-main/venv/lib/python3.11/site-packages/transformers/pipelines/__init__.py", line 779, in pipeline
    framework, model = infer_framework_load_model(
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/un/Downloads/starcoder-main/venv/lib/python3.11/site-packages/transformers/pipelines/base.py", line 271, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model HuggingFaceH4/starchat-beta with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>, <class 'transformers.models.gpt_bigcode.modeling_gpt_bigcode.GPTBigCodeForCausalLM'>).

Process finished with exit code 1

Hugging Face H4 org

Can you please update to the latest transformers version and try again?
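The upgrade suggestion makes sense because starchat-beta uses the GPTBigCode architecture, which older transformers releases cannot load. A minimal sketch of a version check, assuming GPTBigCode support landed around transformers 4.28.0 (the helper name and cutoff are illustrative, not from this thread):

```python
def supports_gpt_bigcode(installed_version: str, minimum=(4, 28, 0)) -> bool:
    """Return True if a transformers version string is at least `minimum`.

    Compares only the (major, minor, patch) numeric components, so
    pre-release suffixes are ignored for this rough check.
    """
    parts = tuple(int(p) for p in installed_version.split(".")[:3])
    return parts >= minimum

# Example checks against plain release strings:
print(supports_gpt_bigcode("4.27.4"))  # False - too old to load GPTBigCode
print(supports_gpt_bigcode("4.31.0"))  # True
```

If the installed version (`python -c "import transformers; print(transformers.__version__)"`) is below the cutoff, `pip install --upgrade transformers` and retry the pipeline call.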

I am encountering the same error. Were you able to find a way to fix it?

I am running into the same error. Is there a workaround?
