runtime error

red.so
-- Installing: /tmp/tmprg8r6dqn/wheel/platlib/lib/cmake/Llama/LlamaConfig.cmake
-- Installing: /tmp/tmprg8r6dqn/wheel/platlib/lib/cmake/Llama/LlamaConfigVersion.cmake
-- Installing: /tmp/tmprg8r6dqn/wheel/platlib/include/ggml.h
-- Installing: /tmp/tmprg8r6dqn/wheel/platlib/include/k_quants.h
-- Installing: /tmp/tmprg8r6dqn/wheel/platlib/lib/libllama.so
-- Installing: /tmp/tmprg8r6dqn/wheel/platlib/include/llama.h
-- Installing: /tmp/tmprg8r6dqn/wheel/platlib/bin/convert.py
-- Installing: /tmp/tmprg8r6dqn/wheel/platlib/bin/convert-lora-to-ggml.py
-- Installing: /tmp/tmprg8r6dqn/wheel/platlib/llama_cpp/libllama.so
-- Installing: /tmp/pip-install-6hg57t7s/llama-cpp-python_8ee095ed86b14c9293ee984df2be8ef6/llama_cpp/libllama.so
*** Making wheel...
*** Created llama_cpp_python-0.2.11-cp310-cp310-linux_x86_64.whl...
Building wheel for llama-cpp-python (pyproject.toml): finished with status 'done'
Created wheel for llama-cpp-python: filename=llama_cpp_python-0.2.11-cp310-cp310-linux_x86_64.whl size=1011128 sha256=33392d16715f15b9d0f6a197198aed30e3a38d7b6f9536b9319e19c5535dccb5
Stored in directory: /home/user/.cache/pip/wheels/f2/0b/35/554a45cbd976d5426b40faaf4bf8f24c9d3d08c1904508f455
Successfully built llama-cpp-python
Installing collected packages: diskcache, llama-cpp-python
Successfully installed diskcache-5.6.3 llama-cpp-python-0.2.11
[notice] A new release of pip available: 22.3.1 -> 24.1.1
[notice] To update, run: pip install --upgrade pip
[nltk_data] Downloading package punkt to /home/user/nltk_data...
[nltk_data] Unzipping tokenizers/punkt.zip.
Loading Whisper ASR
Traceback (most recent call last):
  File "/home/user/app/app.py", line 45, in <module>
    whisper_model = WhisperModel("large-v3", device="cuda", compute_type="float16")
  File "/usr/local/lib/python3.10/site-packages/faster_whisper/transcribe.py", line 133, in __init__
    self.model = ctranslate2.models.Whisper(
RuntimeError: CUDA failed with error CUDA driver version is insufficient for CUDA runtime version
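The traceback ends in a driver/runtime mismatch raised while constructing the model: the container's NVIDIA driver is older than the CUDA runtime that ctranslate2 was built against. The durable fix is a newer host driver (or a ctranslate2 build matching the available driver), but if degrading to CPU is preferable to crashing at startup, the construction can be wrapped in a retry helper. This is only a sketch; `load_with_fallback` and the `("cpu", "int8")` fallback pair are illustrative names, not part of faster-whisper's API.

```python
# Sketch of a CPU fallback for the crash above. The helper and its
# defaults are illustrative, not part of faster-whisper.

def load_with_fallback(load,
                       prefer=("cuda", "float16"),
                       fallback=("cpu", "int8")):
    """Call load(device, compute_type) with the preferred pair first;
    if the constructor raises RuntimeError (e.g. the CUDA driver is too
    old for the bundled runtime), retry once on the fallback pair."""
    try:
        return load(*prefer)
    except RuntimeError:
        return load(*fallback)
```

With faster-whisper this might be invoked as `load_with_fallback(lambda d, c: WhisperModel("large-v3", device=d, compute_type=c))`, so the failing container would come up on CPU (with `int8` quantization to keep memory and latency tolerable) instead of dying on line 45 of app.py.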
