How to load the xenova/whisper-base model

#1
by bsmani - opened

I am trying to load this model using the transformers pipeline method, but it gives me this error:
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/transformers/pipelines/base.py", line 283, in infer_framework_load_model
model = model_class.from_pretrained(model, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/transformers/modeling_tf_utils.py", line 2877, in from_pretrained
raise EnvironmentError(
OSError: Xenova/whisper-base does not appear to have a file named pytorch_model.bin, tf_model.h5 or model.ckpt
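For reference, this is roughly the code I'm running (a minimal sketch; "sample.wav" is just a placeholder audio file path):

from transformers import pipeline

# The call below is what raises the OSError shown above.
asr = pipeline("automatic-speech-recognition", model="Xenova/whisper-base")
result = asr("sample.wav")  # placeholder audio file
print(result["text"])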

Please help me, and thanks.

Owner

Hi there - this checkpoint is not compatible with the Python library since it uses ONNX weights and not PyTorch weights. If you want to use the original checkpoint from Python, you can try https://huggingface.co/openai/whisper-base.
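For example, something along these lines should work with the standard pipeline API (a minimal sketch; "sample.wav" is just a placeholder audio file):

from transformers import pipeline

# openai/whisper-base ships PyTorch weights, so the pipeline can load it directly.
asr = pipeline("automatic-speech-recognition", model="openai/whisper-base")
result = asr("sample.wav")  # placeholder audio file
print(result["text"])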

Thanks, but I want to use WebGPU. Does openai/whisper-base give that option or not?

Owner

In that case, you can check the source code for the demo I made here: https://github.com/xenova/transformers.js/blob/v3/examples/webgpu-whisper/src/worker.js
