Runtime error
Exit code: 1. Reason: NVIDIA Container Toolkit to start this container with GPU support; see https://docs.nvidia.com/datacenter/cloud-native/ .

Traceback (most recent call last):
  File "/opt/conda/bin/f5-tts_infer-gradio", line 5, in <module>
    from f5_tts.infer.infer_gradio import main
  File "/workspace/F5-TTS/src/f5_tts/infer/infer_gradio.py", line 32, in <module>
    from f5_tts.model import DiT, UNetT
  File "/workspace/F5-TTS/src/f5_tts/model/__init__.py", line 1, in <module>
    from f5_tts.model.cfm import CFM
  File "/workspace/F5-TTS/src/f5_tts/model/cfm.py", line 21, in <module>
    from f5_tts.model.modules import MelSpec
  File "/workspace/F5-TTS/src/f5_tts/model/modules.py", line 18, in <module>
    from librosa.filters import mel as librosa_mel_fn
  File "/opt/conda/lib/python3.11/site-packages/librosa/filters.py", line 59, in <module>
    from .core.convert import note_to_hz, hz_to_midi, midi_to_hz, hz_to_octs
  File "/opt/conda/lib/python3.11/site-packages/librosa/core/convert.py", line 7, in <module>
    from . import notation
  File "/opt/conda/lib/python3.11/site-packages/librosa/core/notation.py", line 1008, in <module>
    @jit(nopython=True, nogil=True, cache=True)
     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/numba/core/decorators.py", line 225, in wrapper
    disp.enable_caching()
  File "/opt/conda/lib/python3.11/site-packages/numba/core/dispatcher.py", line 808, in enable_caching
    self._cache = FunctionCache(self.py_func)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/numba/core/caching.py", line 601, in __init__
    self._impl = self._impl_class(py_func)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/numba/core/caching.py", line 337, in __init__
    raise RuntimeError("cannot cache function %r: no locator available "
RuntimeError: cannot cache function '__o_fold': no locator available for file '/opt/conda/lib/python3.11/site-packages/librosa/core/notation.py'
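The crash itself is separate from the GPU hint in the first line: the import chain f5_tts -> librosa -> numba fails because numba's @jit(cache=True) in librosa's notation.py cannot find a writable place to store its on-disk cache (the librosa install directory is typically read-only inside such a container and no user cache directory is configured). A workaround widely used for this specific numba error, though not something this log itself confirms for this particular image, is to point numba at a writable cache directory via the NUMBA_CACHE_DIR environment variable before librosa is imported. A minimal Python sketch of that idea, assuming /tmp is writable in the container:

    import os

    # Assumption: /tmp is writable inside the container. NUMBA_CACHE_DIR tells
    # numba where to write the cache for functions compiled with
    # @jit(cache=True); it must be set before numba/librosa are imported.
    os.environ.setdefault("NUMBA_CACHE_DIR", "/tmp/numba_cache")
    os.makedirs(os.environ["NUMBA_CACHE_DIR"], exist_ok=True)

    # Only import librosa (or anything that pulls it in, e.g. f5_tts) afterwards.
    import librosa

The same can be done once in the image definition instead of in code (e.g. setting NUMBA_CACHE_DIR=/tmp in the container environment). The NVIDIA Container Toolkit message is a separate issue: the container was started without GPU support, which has to be fixed on the host/runtime side per the linked NVIDIA documentation and is unrelated to the librosa/numba failure.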