runtime error

The cache for model files in Transformers v4.22.0 has been updated. Migrating your old cache. This is a one-time only operation. You can interrupt this and resume the migration later on by calling `transformers.utils.move_cache()`.
0it [00:00, ?it/s]
0it [00:00, ?it/s]
/usr/local/lib/python3.10/site-packages/diffusers/models/transformers/transformer_2d.py:34: FutureWarning: `Transformer2DModelOutput` is deprecated and will be removed in version 1.0.0. Importing `Transformer2DModelOutput` from `diffusers.models.transformer_2d` is deprecated and this will be removed in a future version. Please use `from diffusers.models.modeling_outputs import Transformer2DModelOutput`, instead.
  deprecate("Transformer2DModelOutput", "1.0.0", deprecation_message)
The token has not been saved to the git credentials helper. Pass `add_to_git_credential=True` in this function directly or `--add-to-git-credential` if using via `huggingface-cli` if you want to set the git credential as well.
Traceback (most recent call last):
  File "/home/user/app/app.py", line 24, in <module>
    login(token=HF_API_TOKEN)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/_login.py", line 111, in login
    _login(token, add_to_git_credential=add_to_git_credential, write_permission=write_permission)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/_login.py", line 307, in _login
    raise ValueError("Invalid token passed!")
ValueError: Invalid token passed!
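The traceback shows app.py failing at `login(token=HF_API_TOKEN)` because huggingface_hub rejects the token value. Below is a minimal sketch of what that login step likely looks like, with a guard for the common cause of this error (the secret not being set or being set to a placeholder). The environment-variable name `HF_API_TOKEN` is taken from the traceback, but how app.py actually populates the variable is an assumption.

# Minimal sketch of the login step from app.py, assuming the token is
# provided as a Space secret / environment variable named HF_API_TOKEN.
import os

from huggingface_hub import login

HF_API_TOKEN = os.environ.get("HF_API_TOKEN")

if not HF_API_TOKEN:
    # Fail with a clearer message instead of letting login() raise
    # "Invalid token passed!" on an empty or missing value.
    raise RuntimeError(
        "HF_API_TOKEN is not set; add it as a Space secret before calling login()."
    )

# login() raises ValueError("Invalid token passed!") when the token string
# is malformed, revoked, or otherwise not a valid Hugging Face token,
# which matches the failure shown in the traceback above.
login(token=HF_API_TOKEN)

If the variable is set but the error persists, the token itself is likely expired or mistyped; regenerating it in the Hugging Face account settings and updating the Space secret is the usual fix.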
