runtime error
Downloading (…)64bf0b3f24acb095824c: 100%|██████████| 5.68G/5.68G [01:29<00:00, 63.4MB/s]
The argument `trust_remote_code` is to be used with Auto classes. It has no effect here and is ignored.
Traceback (most recent call last):
  File "app.py", line 5, in <module>
    generate_text = pipeline(model="databricks/dolly-v2-3b", torch_dtype=torch.bfloat16, trust_remote_code=True, device_map="auto")
  File "/home/user/.local/lib/python3.8/site-packages/transformers/pipelines/__init__.py", line 724, in pipeline
    framework, model = infer_framework_load_model(
  File "/home/user/.local/lib/python3.8/site-packages/transformers/pipelines/base.py", line 266, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model databricks/dolly-v2-3b with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>, <class 'transformers.models.gpt_neox.modeling_gpt_neox.GPTNeoXForCausalLM'>).
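The traceback points at line 5 of app.py. A minimal sketch of that file, reconstructed from the call shown in the traceback (the pipeline arguments are copied verbatim from the log; the two imports are assumed), would look like this:

import torch
from transformers import pipeline

# The call from line 5 of the traceback: build a text-generation pipeline for
# databricks/dolly-v2-3b, loading weights in bfloat16 and letting
# device_map="auto" place them on the available devices.
generate_text = pipeline(
    model="databricks/dolly-v2-3b",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)

The ValueError is raised only after pipeline() has failed to instantiate the checkpoint with both AutoModelForCausalLM and GPTNeoXForCausalLM, and the helper hides the underlying from_pretrained error. One way to surface the real cause is to call AutoModelForCausalLM.from_pretrained("databricks/dolly-v2-3b") directly and read that exception; likely candidates include insufficient memory for the ~5.68 GB checkpoint or a missing accelerate install, which device_map="auto" requires.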