runtime error

Downloading (…)64bf0b3f24acb095824c: 100%|██████████| 5.68G/5.68G [01:35<00:00, 59.8MB/s]

The argument `trust_remote_code` is to be used with Auto classes. It has no effect here and is ignored.

Traceback (most recent call last):
  File "app.py", line 4, in <module>
    ans = pipeline(model="databricks/dolly-v2-3b", torch_dtype=torch.bfloat16, trust_remote_code=True, device_map="auto")
  File "/home/user/.local/lib/python3.8/site-packages/transformers/pipelines/__init__.py", line 754, in pipeline
    framework, model = infer_framework_load_model(
  File "/home/user/.local/lib/python3.8/site-packages/transformers/pipelines/base.py", line 266, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model databricks/dolly-v2-3b with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>, <class 'transformers.models.gpt_neox.modeling_gpt_neox.GPTNeoXForCausalLM'>).
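For reference, the code that triggers this failure can be reconstructed from the traceback: app.py builds a generation pipeline for the Dolly checkpoint directly via transformers.pipeline(). A minimal sketch of that script is below. Only the pipeline(...) call comes from the traceback (app.py, line 4); the imports and the final generation call are assumptions added to make the snippet self-contained.

import torch
from transformers import pipeline

# As in the traceback (app.py, line 4): load databricks/dolly-v2-3b in bfloat16
# and let device_map="auto" place the weights across available devices.
ans = pipeline(
    model="databricks/dolly-v2-3b",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)

# Assumed usage: the returned pipeline object is callable with a prompt string.
print(ans("Explain what a runtime error is."))

The ValueError itself is raised inside infer_framework_load_model after transformers fails to instantiate the checkpoint with either AutoModelForCausalLM or GPTNeoXForCausalLM; the underlying exception from those load attempts is not shown in this log.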
