Getting "Could not load model" errors when using inference api.

#73
by breisa - opened

Why am I getting this error when using the Inference API? My code used to work, and I haven't changed anything.

500 Internal Server Error: "{"error":"Could not load model facebook/bart-large-cnn with any of the following classes: (<class 'transformers.models.bart.modeling_bart.BartForConditionalGeneration'>, <class 'transformers.models.bart.modeling_tf_bart.TFBartForConditionalGeneration'>)."}"
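For context, the request is roughly like the sketch below (the token value is a placeholder; your actual call may use a different client, but the endpoint and payload shape are the standard ones for this model):

```python
import requests

# Hosted Inference API endpoint for the summarization model
API_URL = "https://api-inference.huggingface.co/models/facebook/bart-large-cnn"
HF_TOKEN = "hf_xxx"  # placeholder for a real Hugging Face access token

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {HF_TOKEN}"},
    json={"inputs": "Long article text to summarize ..."},
)
print(response.status_code)   # 500 when the failure occurs
print(response.text)          # '{"error":"Could not load model facebook/bart-large-cnn ..."}'
```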

This is a "500 Internal Server Error" . I just now tried loading, it's working. Re-try after some time.

Yeah, the issue still persists.

Confirmed, the issue still persists.

Could not load model facebook/bart-large-cnn with any of the following classes: (<class 'transformers.models.bart.modeling_bart.BartForConditionalGeneration'>, <class 'transformers.models.bart.modeling_tf_bart.TFBartForConditionalGeneration'>).

The above issue still persists.

The issue still persists as of Wed 3/27/24, 9:51 AM PDT.

[Screenshot of the error message, captured 2024-03-27 at 9:52 AM]

Thanks for looking into this, HF wizards!

Hi, does anyone know how to fix the "Could not load model" error?
