runtime error

config.json: 100%|██████████| 31.0/31.0 [00:00<00:00, 191kB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 4, in <module>
    my_pipe = pipeline("text-generation", model="TheBloke/Yarn-Mistral-7B-128k-GGUF")
  File "/home/user/.local/lib/python3.10/site-packages/transformers/pipelines/__init__.py", line 870, in pipeline
    framework, model = infer_framework_load_model(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/pipelines/base.py", line 282, in infer_framework_load_model
    raise ValueError(
ValueError: Could not load model TheBloke/Yarn-Mistral-7B-128k-GGUF with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>,). See the original errors:

while loading with AutoModelForCausalLM, an error is thrown:
Traceback (most recent call last):
  File "/home/user/.local/lib/python3.10/site-packages/transformers/pipelines/base.py", line 269, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
  File "/home/user/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 566, in from_pretrained
    return model_class.from_pretrained(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3321, in from_pretrained
    raise EnvironmentError(
OSError: TheBloke/Yarn-Mistral-7B-128k-GGUF does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.
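The OSError at the bottom is the root cause: the repo `TheBloke/Yarn-Mistral-7B-128k-GGUF` contains only GGUF quantizations, and the transformers version in this traceback only knows how to load framework-native weight files (`pytorch_model.bin`, `tf_model.h5`, `model.ckpt`, or `flax_model.msgpack`). A minimal sketch of that check, with an assumed (not fetched) file list standing in for the GGUF repo's contents:

```python
# Weight filenames that transformers' from_pretrained looks for,
# taken from the OSError message in the traceback above.
EXPECTED_WEIGHTS = {"pytorch_model.bin", "tf_model.h5",
                    "model.ckpt", "flax_model.msgpack"}

# Illustrative file list for a GGUF-only quantization repo
# (assumed for this sketch, not downloaded from the Hub).
gguf_repo_files = ["config.json", "README.md",
                   "yarn-mistral-7b-128k.Q4_K_M.gguf"]

def has_loadable_weights(files):
    """Return True if any transformers-loadable weight file is present."""
    return any(f in EXPECTED_WEIGHTS for f in files)

print(has_loadable_weights(gguf_repo_files))  # → False, hence the OSError
```

Because no such file exists, `from_pretrained` raises the OSError, which the pipeline wraps in the ValueError. To actually run a GGUF checkpoint, a runtime that reads GGUF directly (such as llama-cpp-python or ctransformers) is typically used instead of `transformers.pipeline`, or a repo that ships the original PyTorch weights is loaded instead.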
