Runtime error

Traceback (most recent call last):
  File "/home/user/app/app.py", line 13, in <module>
    pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
  File "/usr/local/lib/python3.10/site-packages/transformers/pipelines/__init__.py", line 905, in pipeline
    framework, model = infer_framework_load_model(
  File "/usr/local/lib/python3.10/site-packages/transformers/pipelines/base.py", line 292, in infer_framework_load_model
    raise ValueError(
ValueError: Could not load model nlux/CodeLlama-7b-hf_merge with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>, <class 'transformers.models.llama.modeling_llama.LlamaForCausalLM'>). See the original errors:

while loading with AutoModelForCausalLM, an error is thrown:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/transformers/pipelines/base.py", line 279, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 563, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3260, in from_pretrained
    raise EnvironmentError(
OSError: nlux/CodeLlama-7b-hf_merge does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.

while loading with LlamaForCausalLM, an error is thrown:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/transformers/pipelines/base.py", line 279, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3260, in from_pretrained
    raise EnvironmentError(
OSError: nlux/CodeLlama-7b-hf_merge does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.
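The OSError means from_pretrained found none of the weight-file names it knows in the nlux/CodeLlama-7b-hf_merge repository. Two common causes fit this message: the repo holds only a PEFT/LoRA adapter (adapter_model.bin plus adapter_config.json) rather than full merged weights, or it holds only model.safetensors while the running environment lacks the safetensors package (the error lists model.safetensors as a candidate only when that package is importable). Below is a minimal sketch of the first fix: merging the adapter into its base model and saving full weights. The base-model id codellama/CodeLlama-7b-hf and the assumption that the repo is an adapter are unverified; adjust both to match the actual repository.

# Hypothetical repair script: fold a LoRA adapter into the base model and
# save full weights that AutoModelForCausalLM can load directly.
# Assumes nlux/CodeLlama-7b-hf_merge currently contains only adapter files.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("codellama/CodeLlama-7b-hf")
model = PeftModel.from_pretrained(base, "nlux/CodeLlama-7b-hf_merge")
merged = model.merge_and_unload()  # bake the adapter weights into the base model

tokenizer = AutoTokenizer.from_pretrained("codellama/CodeLlama-7b-hf")
merged.save_pretrained("CodeLlama-7b-hf_merged")     # writes full model weights
tokenizer.save_pretrained("CodeLlama-7b-hf_merged")  # keep the tokenizer alongside

After pushing the merged folder back to the Hub (for example with merged.push_to_hub(...)), the pipeline call in app.py should be able to load it. If the repo instead already contains model.safetensors, adding safetensors to the Space's requirements may be sufficient on its own.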
