Receiving an error using the example from the model card

#4
by cozydive77 - opened

Hello,

I hope this is the right place to post this; if not, please excuse me, I am pretty new here.
I used the exact example Python code from the model card to get started testing Alfred. After downloading the model files, I receive the following error when running the script:

Traceback (most recent call last):
  File "Alfred-40B-0723.py", line 8, in <module>
    pipeline = transformers.pipeline(
  File "C:\Program Files\Python311\Lib\site-packages\transformers\pipelines\__init__.py", line 788, in pipeline
    framework, model = infer_framework_load_model(
  File "C:\Program Files\Python311\Lib\site-packages\transformers\pipelines\base.py", line 278, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model lightonai/alfred-40b-0723 with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>, <class 'transformers.models.auto.modeling_tf_auto.TFAutoModelForCausalLM'>).

This is with Python 3.11 on Windows. I have tried a few other models from HF so far (e.g. Whisper medium), which worked fine. Could someone please give me a hint on how to solve this? I have googled the error but couldn't find a solution so far; the hints I found were related to PyTorch, which is installed and up to date. Is there a workaround without AutoModelForCausalLM? Thank you very much for your support!
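For reference, here is a sketch of the kind of call I am making, adapted from the model card example. The wrapper function and the trust_remote_code=True flag are my own additions while experimenting; I am not certain that flag is the right fix, but other models on the Hub that ship their own modeling code seem to need it so AutoModelForCausalLM can pick up the remote architecture:

```python
def build_alfred_pipeline(model_id="lightonai/alfred-40b-0723"):
    # Imports kept inside the function so the sketch can be read
    # without torch and transformers installed.
    import torch
    import transformers

    return transformers.pipeline(
        "text-generation",
        model=model_id,
        torch_dtype=torch.bfloat16,   # as in the model card example
        trust_remote_code=True,       # my guess at the missing piece
        device_map="auto",
    )

# Usage (downloads the full model weights, so this is not a quick test):
# pipe = build_alfred_pipeline()
# pipe("Hello, Alfred!", max_new_tokens=40)
```

If trust_remote_code=True is not the answer here, loading the model directly with AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True) and building the pipeline from that is the other thing I would try.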

Best regards,
Matthias
