runtime error

Exit code: 1. Reason:

```
config.json: 100%|██████████| 812/812 [00:00<00:00, 4.20MB/s]
`torch_dtype` is deprecated! Use `dtype` instead!
model.safetensors: 100%|██████████| 538M/538M [00:03<00:00, 142MB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 8, in <module>
    model = AutoModelForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 604, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 288, in _wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 5176, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 5597, in _load_pretrained_model
    caching_allocator_warmup(model_to_load, expanded_device_map, hf_quantizer)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 6215, in caching_allocator_warmup
    index = device.index if device.index is not None else torch_accelerator_module.current_device()
  File "/usr/local/lib/python3.10/site-packages/torch/cuda/__init__.py", line 1071, in current_device
    _lazy_init()
  File "/usr/local/lib/python3.10/site-packages/torch/cuda/__init__.py", line 412, in _lazy_init
    torch._C._cuda_init()
RuntimeError: Found no NVIDIA driver on your system. Please check that you have an NVIDIA GPU and installed a driver from http://www.nvidia.com/Download/index.aspx
```

The model weights download successfully; the crash happens afterwards, when `from_pretrained` tries to warm up the CUDA caching allocator on a machine that has no NVIDIA driver (e.g. a CPU-only container).
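A minimal sketch of a workaround, assuming the app can run on CPU: detect whether CUDA is actually available before loading, and pass an explicit device so `from_pretrained` never tries to initialise CUDA. The model id below is a placeholder, since the log does not show which checkpoint `app.py` loads; this also switches the deprecated `torch_dtype` argument to `dtype` as the warning in the log suggests.

```python
import torch
from transformers import AutoModelForCausalLM

# Placeholder: substitute the checkpoint your app.py actually loads.
MODEL_ID = "your-org/your-model"

# Choose the device up front instead of assuming a GPU exists.
device = "cuda" if torch.cuda.is_available() else "cpu"

def load_model(model_id: str = MODEL_ID):
    """Load the model on whatever device is actually present."""
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        dtype=torch.float16 if device == "cuda" else torch.float32,
        device_map=device,  # explicit device; avoids CUDA lazy-init on CPU boxes
    )
    return model
```

On a CPU-only Space this loads in float32 on `cpu`; on GPU hardware it uses float16 on `cuda`. Alternatively, if the Space is *supposed* to have a GPU, the fix is in the Space settings (select GPU hardware) rather than in the code.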
