runtime error
Exit code: 1. Reason:
tokenizer_config.json: 100%|██████████| 6.77k/6.77k [00:00<00:00, 41.6MB/s]
tokenizer.json: 100%|██████████| 11.4M/11.4M [00:00<00:00, 103MB/s]
special_tokens_map.json: 100%|██████████| 485/485 [00:00<00:00, 3.73MB/s]
config.json: 100%|██████████| 702/702 [00:00<00:00, 6.20MB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 8, in <module>
    model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map="auto")
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 573, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 272, in _wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4317, in from_pretrained
    checkpoint_files, sharded_metadata = _get_resolved_checkpoint_files(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 1130, in _get_resolved_checkpoint_files
    raise EnvironmentError(
OSError: Walelign/Maths_fine-tuned-deepseek-r1-1.5b does not appear to have a file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt or flax_model.msgpack.
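The tokenizer and config files download successfully, so the failure is specific to the model weights: the repository named in model_name contains none of the checkpoint files that from_pretrained() recognizes. Below is a minimal diagnostic sketch, assuming the public huggingface_hub API (list_repo_files) and that the repo id shown in the OSError is the one app.py loads, to confirm which files the repository actually holds.

from huggingface_hub import list_repo_files

# Repo id taken from the OSError above; adjust if app.py points elsewhere.
repo_id = "Walelign/Maths_fine-tuned-deepseek-r1-1.5b"

# from_pretrained() looks for pytorch_model.bin, model.safetensors (or their
# sharded variants), tf_model.h5, model.ckpt, or flax_model.msgpack. Listing
# the repo contents shows whether any of these was actually pushed to the Hub.
files = list_repo_files(repo_id)
print("\n".join(sorted(files)))

missing_weights = not any(
    f.endswith((".safetensors", ".bin", ".h5", ".ckpt", ".msgpack")) for f in files
)
print("No weight checkpoint found" if missing_weights else "Weight checkpoint present")

If no checkpoint shows up, the usual cause is that only the tokenizer and config were pushed after fine-tuning; re-uploading the trained weights (for example via model.push_to_hub(repo_id), or save_pretrained followed by an upload) and restarting the Space should clear the OSError.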