runtime error

red.so
-- Installing: /tmp/tmpmpg7m0vi/wheel/platlib/lib/cmake/Llama/LlamaConfig.cmake
-- Installing: /tmp/tmpmpg7m0vi/wheel/platlib/lib/cmake/Llama/LlamaConfigVersion.cmake
-- Installing: /tmp/tmpmpg7m0vi/wheel/platlib/include/ggml.h
-- Installing: /tmp/tmpmpg7m0vi/wheel/platlib/include/k_quants.h
-- Installing: /tmp/tmpmpg7m0vi/wheel/platlib/lib/libllama.so
-- Installing: /tmp/tmpmpg7m0vi/wheel/platlib/include/llama.h
-- Installing: /tmp/tmpmpg7m0vi/wheel/platlib/bin/convert.py
-- Installing: /tmp/tmpmpg7m0vi/wheel/platlib/bin/convert-lora-to-ggml.py
-- Installing: /tmp/tmpmpg7m0vi/wheel/platlib/llama_cpp/libllama.so
-- Installing: /tmp/pip-install-sbncsj5u/llama-cpp-python_5b3c6170d0864023b6579f921ec76c7f/llama_cpp/libllama.so
*** Making wheel...
*** Created llama_cpp_python-0.2.11-cp310-cp310-linux_x86_64.whl...
Building wheel for llama-cpp-python (pyproject.toml): finished with status 'done'
Created wheel for llama-cpp-python: filename=llama_cpp_python-0.2.11-cp310-cp310-linux_x86_64.whl size=1011129 sha256=b96715ed726bcc1c6aff9e7eb4c79257faa3b31665c9af6e2a2f07a22a6b34f5
Stored in directory: /home/user/.cache/pip/wheels/f2/0b/35/554a45cbd976d5426b40faaf4bf8f24c9d3d08c1904508f455
Successfully built llama-cpp-python
Installing collected packages: diskcache, llama-cpp-python
Successfully installed diskcache-5.6.3 llama-cpp-python-0.2.11
[notice] A new release of pip available: 22.3.1 -> 24.1.1
[notice] To update, run: pip install --upgrade pip
[nltk_data] Downloading package punkt to /home/user/nltk_data...
[nltk_data] Unzipping tokenizers/punkt.zip.
Loading Whisper ASR
Traceback (most recent call last):
  File "/home/user/app/app.py", line 26, in <module>
    whisper_model = WhisperModel("large-v3", device="cuda", compute_type="float16")
  File "/usr/local/lib/python3.10/site-packages/faster_whisper/transcribe.py", line 133, in __init__
    self.model = ctranslate2.models.Whisper(
RuntimeError: CUDA failed with error CUDA driver version is insufficient for CUDA runtime version
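The crash happens because the app unconditionally constructs WhisperModel with device="cuda", and the host's NVIDIA driver is older than the CUDA runtime that CTranslate2 was built against. One common workaround is to fall back to CPU inference when no usable GPU is present. A minimal sketch (the helper pick_whisper_config is hypothetical, not part of faster-whisper; how you detect CUDA, e.g. via ctranslate2.get_cuda_device_count(), is an assumption):

```python
def pick_whisper_config(cuda_available: bool) -> tuple[str, str]:
    """Choose (device, compute_type) for WhisperModel.

    float16 is only meaningful on a GPU; int8 is the usual
    low-memory choice for CPU inference with CTranslate2.
    """
    if cuda_available:
        return ("cuda", "float16")
    return ("cpu", "int8")
```

It could then be wired up before loading the model, for example:
device, compute_type = pick_whisper_config(ctranslate2.get_cuda_device_count() > 0)
whisper_model = WhisperModel("large-v3", device=device, compute_type=compute_type)
Note that CPU inference of large-v3 is far slower; the real fix on a GPU host is updating the NVIDIA driver to one that supports the installed CUDA runtime.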
