runtime error

ympy->torch->flash-attn) (1.3.0)
Building wheels for collected packages: flash-attn
  Building wheel for flash-attn (setup.py): started
  Building wheel for flash-attn (setup.py): finished with status 'done'
  Created wheel for flash-attn: filename=flash_attn-2.6.2-py3-none-any.whl size=187219564 sha256=37891ca4085bb30d0de1a4b174803962ba64a17cd6850a7969bb1c32fe35379f
  Stored in directory: /home/user/.cache/pip/wheels/ff/8c/8d/5e5f2312258ec81ff5d5c32978910e2474c33556579d8004e2
Successfully built flash-attn
Installing collected packages: einops, flash-attn
Successfully installed einops-0.8.0 flash-attn-2.6.2

[notice] A new release of pip available: 22.3.1 -> 24.1.2
[notice] To update, run: /usr/local/bin/python -m pip install --upgrade pip

Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.

Traceback (most recent call last):
  File "/home/user/app/app.py", line 75, in <module>
    model = AutoModelForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 521, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1135, in from_pretrained
    return config_class.from_dict(config_dict, **unused_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/configuration_utils.py", line 763, in from_dict
    config = cls(**config_dict)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/llama/configuration_llama.py", line 160, in __init__
    self._rope_scaling_validation()
  File "/usr/local/lib/python3.10/site-packages/transformers/models/llama/configuration_llama.py", line 180, in _rope_scaling_validation
    raise ValueError(
ValueError: `rope_scaling` must be a dictionary with with two fields, `type` and `factor`, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
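What the traceback shows: the installed transformers version validates `rope_scaling` against the legacy Llama format, which allows exactly the two keys `type` and `factor`, while the checkpoint's config ships the extended Llama 3.1-style dict (`rope_type`, `high_freq_factor`, etc.). A minimal sketch of the mismatch, using the dict copied from the error message (the `legacy_keys` set mirrors the old validator's expectation; this is an illustration, not the transformers source):

```python
# rope_scaling as reported in the ValueError above
rope_scaling = {
    "factor": 8.0,
    "high_freq_factor": 4.0,
    "low_freq_factor": 1.0,
    "original_max_position_embeddings": 8192,
    "rope_type": "llama3",
}

# The old validator effectively requires exactly these two keys
legacy_keys = {"type", "factor"}

# The extended Llama 3.1 dict does not match, so validation raises ValueError
needs_newer_transformers = set(rope_scaling) != legacy_keys
print(needs_newer_transformers)  # → True
```

The usual fix is to upgrade transformers to a release that understands the extended format (Llama 3.1 support landed around transformers 4.43), e.g. by pinning `transformers>=4.43.0` in the Space's requirements.txt, rather than editing the model's config.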
