Runtime error

Downloading shards: 100%|██████████| 3/3 [07:18<00:00, 146.05s/it]

Traceback (most recent call last):
  File "/home/user/app/web_demo_old.py", line 11, in <module>
    model = AutoModelForCausalLM.from_pretrained("baichuan-inc/Baichuan-13B-Chat", device_map="auto", torch_dtype=torch.float16, trust_remote_code=True)
  File "/home/user/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 466, in from_pretrained
    return model_class.from_pretrained(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2646, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2740, in _load_pretrained_model
    raise ValueError(
ValueError: The current `device_map` had weights offloaded to the disk. Please provide an `offload_folder` for them. Alternatively, make sure you have `safetensors` installed if the model you are using offers the weights in this format.
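The error itself suggests two remedies: pass an `offload_folder` so accelerate has somewhere on disk to place the weights it cannot fit in GPU/CPU memory, or install `safetensors` if the model publishes weights in that format. A minimal sketch of the first fix, assuming the app has write access to a local `offload` directory (the folder name is a hypothetical choice; any writable path works):

```python
def load_model(offload_dir: str = "offload"):
    """Load Baichuan-13B-Chat, giving accelerate a disk folder for offloaded weights."""
    # Imports kept inside the function so the sketch stays self-contained.
    import torch
    from transformers import AutoModelForCausalLM

    return AutoModelForCausalLM.from_pretrained(
        "baichuan-inc/Baichuan-13B-Chat",
        device_map="auto",
        torch_dtype=torch.float16,
        trust_remote_code=True,
        offload_folder=offload_dir,  # directory where disk-offloaded weights are written
    )
```

Alternatively, `pip install safetensors` before starting the Space may avoid the offload requirement entirely, provided the model repository offers `.safetensors` weights.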
