
Issues running the example in the model card

#4
by LittleFoxCode - opened

I installed all the required dependencies and tried to run the example, but I keep getting this error:

OSError: stabilityai/StableBeluga-7B does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.

It never seems to attempt to download the model and just hangs here. Any suggestions?
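For context, the call that fails is essentially the standard transformers loading snippet from the model card (paraphrased from memory, so the exact arguments may differ slightly):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Loading the tokenizer works fine.
tokenizer = AutoTokenizer.from_pretrained("stabilityai/StableBeluga-7B", use_fast=False)

# This from_pretrained call is where the OSError about pytorch_model.bin is raised.
model = AutoModelForCausalLM.from_pretrained(
    "stabilityai/StableBeluga-7B",
    torch_dtype=torch.float16,
    low_cpu_mem_usage=True,
    device_map="auto",
)
```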
