runtime error

Space failed. Exit code: 1. Reason: s. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2023-08-18 21:15:03.833028: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-08-18 21:15:04.586446: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
All TF 2.0 model weights were used when initializing BartForConditionalGeneration.
All the weights of BartForConditionalGeneration were initialized from the TF 2.0 model. If your task is similar to the task the model of the checkpoint was trained on, you can already use BartForConditionalGeneration for predictions without further training.
Downloading (…)neration_config.json:   0%|          | 0.00/292 [00:00<?, ?B/s]
Downloading (…)neration_config.json: 100%|██████████| 292/292 [00:00<00:00, 297kB/s]
Caching examples at: '/home/user/app/gradio_cached_examples/15'
Traceback (most recent call last):
  File "app.py", line 18, in <module>
    gr.Interface(generate, inputs = input_component, outputs=output_component, examples=examples, title = "👨🏻‍🎤 ChatGPT Prompt Generator v12 👨🏻‍🎤", description=description).launch()
  File "/home/user/.local/lib/python3.8/site-packages/gradio/interface.py", line 456, in __init__
    self.render_article()
  File "/home/user/.local/lib/python3.8/site-packages/gradio/blocks.py", line 1200, in __exit__
    self.config = self.get_config_file()
  File "/home/user/.local/lib/python3.8/site-packages/gradio/blocks.py", line 1176, in get_config_file
    "input": list(block.input_api_info()),  # type: ignore
  File "/home/user/.local/lib/python3.8/site-packages/gradio_client/serializing.py", line 41, in input_api_info
    return (api_info["serialized_input"][0], api_info["serialized_input"][1])
KeyError: 'serialized_input'
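For context, here is a minimal sketch of what `app.py` around the failing line 18 likely looks like, reconstructed from the traceback and the model-loading log above. Only the names visible in the log are taken from it (`generate`, `input_component`, `output_component`, `examples`, `description`, the title string, and the fact that `BartForConditionalGeneration` is loaded from TF 2.0 weights); the checkpoint name, tokenizer usage, and component types are assumptions for illustration, not the Space's actual code.

```python
# Hypothetical reconstruction of app.py; checkpoint name and component
# choices are assumptions, only names from the traceback are known.
import gradio as gr
from transformers import AutoTokenizer, BartForConditionalGeneration

# The log shows BART weights loaded from a TF 2.0 checkpoint, hence from_tf=True.
# "some-user/prompt-generator-bart" is a placeholder checkpoint name.
tokenizer = AutoTokenizer.from_pretrained("some-user/prompt-generator-bart")
model = BartForConditionalGeneration.from_pretrained(
    "some-user/prompt-generator-bart", from_tf=True
)

def generate(persona: str) -> str:
    """Generate a ChatGPT prompt for the given persona."""
    inputs = tokenizer(persona, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=150)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

input_component = gr.Textbox(label="Persona")
output_component = gr.Textbox(label="Generated prompt")
examples = [["photographer"], ["developer"]]
description = "Generate ChatGPT prompts from a persona."

# The call shown at app.py line 18 in the traceback: the KeyError is raised
# while gr.Interface serializes its config, before launch() ever completes.
gr.Interface(
    generate,
    inputs=input_component,
    outputs=output_component,
    examples=examples,
    title="👨🏻‍🎤 ChatGPT Prompt Generator v12 👨🏻‍🎤",
    description=description,
).launch()
```

As the traceback shows, the `KeyError: 'serialized_input'` is raised inside `gradio_client/serializing.py` while `gradio` builds the `Interface` config, so the Space crashes during startup, before the UI is served.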
