runtime error
Exit code: 1.

tf_model.h5: 100%|██████████| 1.63G/1.63G [00:04<00:00, 379MB/s]

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/transformers/activations_tf.py", line 22, in <module>
    import tf_keras as keras
ModuleNotFoundError: No module named 'tf_keras'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 5, in <module>
    model = AutoModelForSeq2SeqLM.from_pretrained("merve/chatgpt-prompts-bart-long", from_tf=True)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 600, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 317, in _wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 5059, in from_pretrained
    model, loading_info = cls._load_from_tf(model, config, checkpoint_files)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 5660, in _load_from_tf
    model, loading_info = load_tf2_checkpoint_in_pytorch_model(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_tf_pytorch_utils.py", line 518, in load_tf2_checkpoint_in_pytorch_model
    from .modeling_tf_utils import load_tf_weights
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_tf_utils.py", line 38, in <module>
    from .activations_tf import get_tf_activation
  File "/usr/local/lib/python3.10/site-packages/transformers/activations_tf.py", line 27, in <module>
    raise ValueError(
ValueError: Your currently installed version of Keras is Keras 3, but this is not yet supported in Transformers. Please install the backwards-compatible tf-keras package with `pip install tf-keras`.
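The ValueError at the bottom of the traceback names the fix: the environment has Keras 3 installed, but loading a TensorFlow checkpoint with `from_tf=True` requires the backwards-compatible Keras 2 API, provided by the tf-keras package. Adding `tf-keras` to the Space's requirements.txt (or running `pip install tf-keras`) should resolve it. As a minimal sketch of a preflight check before loading the model, using only the standard library (the helper name is illustrative, not part of transformers):

```python
import importlib.util


def missing_tf_keras_hint():
    """Return the remediation hint if tf_keras is not installed, else None.

    Mirrors the guard in transformers/activations_tf.py that raised the
    ValueError in the traceback above, but checks availability without
    importing the package.
    """
    if importlib.util.find_spec("tf_keras") is None:
        return "pip install tf-keras"
    return None


if __name__ == "__main__":
    hint = missing_tf_keras_hint()
    if hint is not None:
        print(f"tf_keras not found; install it with: {hint}")
```

Running such a check at the top of app.py surfaces the missing dependency with a clear message before the 1.63 GB tf_model.h5 download even starts, rather than failing afterwards as in the log above.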