runtime error

| 83.9M/117M [00:00<00:00, 109MB/s]
Downloading (…)7ae67a2c00c715e04ef2:  90%|████████▉ | 105M/117M [00:00<00:00, 108MB/s]
Downloading (…)7ae67a2c00c715e04ef2: 100%|██████████| 117M/117M [00:01<00:00, 91.2MB/s]
Downloading (…)7ae67a2c00c715e04ef2: 100%|██████████| 117M/117M [00:01<00:00, 105MB/s]
Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.
Running with the wolves that came out to tear apart the trees, that we're afraid of, that we'll bury our children in the fields, That this world will not find out for us, That these traitors from the wolf-oil fields might not find me. -Green Day
Traceback (most recent call last):
  File "app.py", line 32, in <module>
    iface = gr.Interface(fn=generate_captioned_img, inputs=[gr.Textbox(value="Running with the wolves", label="Lyrics prompt", lines=1),
  File "/home/user/.local/lib/python3.8/site-packages/gradio/interface.py", line 426, in __init__
    cache_interface_examples(self)
  File "/home/user/.local/lib/python3.8/site-packages/gradio/process_examples.py", line 51, in cache_interface_examples
    raise e
  File "/home/user/.local/lib/python3.8/site-packages/gradio/process_examples.py", line 47, in cache_interface_examples
    prediction = process_example(interface, example_id)
  File "/home/user/.local/lib/python3.8/site-packages/gradio/process_examples.py", line 29, in process_example
    prediction = interface.process(raw_input)
  File "/home/user/.local/lib/python3.8/site-packages/gradio/interface.py", line 754, in process
    predictions = self.run_prediction(processed_input)
  File "/home/user/.local/lib/python3.8/site-packages/gradio/interface.py", line 718, in run_prediction
    prediction = predict_fn(*processed_input)
  File "app.py", line 19, in generate_captioned_img
    wrapped_text = wrap_text(generated_text)
  File "/home/user/app/text_utils.py", line 6, in wrap_text
    quote, author = generated_text.split("-")
ValueError: too many values to unpack (expected 2)
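The failure is in wrap_text in text_utils.py: generated_text.split("-") returns more than two pieces whenever the generated lyrics contain extra hyphens (as in "wolf-oil" above), so unpacking into exactly two names raises the ValueError. Below is a minimal sketch of one way to make that split robust. It assumes wrap_text is only meant to separate the quote from a trailing author credit and wrap the quote for display; the width parameter and the textwrap usage are illustrative assumptions, not the Space's actual code.

import textwrap

def wrap_text(generated_text: str, width: int = 60) -> str:
    # Assumed intent: split at the LAST hyphen only, so hyphens inside the
    # lyrics don't break the two-value unpacking.
    quote, sep, author = generated_text.rpartition("-")
    if not sep:
        # No hyphen at all: treat the whole string as the quote.
        quote, author = generated_text, ""
    wrapped_quote = textwrap.fill(quote.strip(), width=width)
    return f"{wrapped_quote}\n-{author.strip()}" if author else wrapped_quote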
