Runtime error

Downloading (…)7d5fc9263bc9fca8bdb1: 100%|██████████| 4.48G/4.48G [00:54<00:00, 82.5MB/s]
Downloading shards: 100%|██████████| 2/2 [02:49<00:00, 84.73s/it]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 21, in <module>
    pipeline = transformers.pipeline(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/pipelines/__init__.py", line 788, in pipeline
    framework, model = infer_framework_load_model(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/pipelines/base.py", line 279, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model tiiuae/falcon-7b with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>,).
