Runtime error

hared.so
-- Installing: /tmp/tmpbza2tylt/wheel/platlib/lib/cmake/Llama/LlamaConfig.cmake
-- Installing: /tmp/tmpbza2tylt/wheel/platlib/lib/cmake/Llama/LlamaConfigVersion.cmake
-- Installing: /tmp/tmpbza2tylt/wheel/platlib/include/ggml.h
-- Installing: /tmp/tmpbza2tylt/wheel/platlib/include/k_quants.h
-- Installing: /tmp/tmpbza2tylt/wheel/platlib/lib/libllama.so
-- Installing: /tmp/tmpbza2tylt/wheel/platlib/include/llama.h
-- Installing: /tmp/tmpbza2tylt/wheel/platlib/bin/convert.py
-- Installing: /tmp/tmpbza2tylt/wheel/platlib/bin/convert-lora-to-ggml.py
-- Installing: /tmp/tmpbza2tylt/wheel/platlib/llama_cpp/libllama.so
-- Installing: /tmp/pip-install-ofo16iqq/llama-cpp-python_e2b13956053e48178a516c7718600cc5/llama_cpp/libllama.so
*** Making wheel...
*** Created llama_cpp_python-0.2.11-cp310-cp310-linux_x86_64.whl...
Building wheel for llama-cpp-python (pyproject.toml): finished with status 'done'
Created wheel for llama-cpp-python: filename=llama_cpp_python-0.2.11-cp310-cp310-linux_x86_64.whl size=1011125 sha256=1e2081666b46418cc446eeca84a33edfef8d61fb4c99cd1af77d63767ab58d55
Stored in directory: /home/user/.cache/pip/wheels/f2/0b/35/554a45cbd976d5426b40faaf4bf8f24c9d3d08c1904508f455
Successfully built llama-cpp-python
Installing collected packages: diskcache, llama-cpp-python
Successfully installed diskcache-5.6.3 llama-cpp-python-0.2.11
[notice] A new release of pip available: 22.3.1 -> 24.0
[notice] To update, run: pip install --upgrade pip
[nltk_data] Downloading package punkt to /home/user/nltk_data...
[nltk_data] Unzipping tokenizers/punkt.zip.
Loading Whisper ASR
Traceback (most recent call last):
  File "/home/user/app/app.py", line 24, in <module>
    whisper_model = WhisperModel("large-v3", device="cuda", compute_type="float16")
  File "/usr/local/lib/python3.10/site-packages/faster_whisper/transcribe.py", line 133, in __init__
    self.model = ctranslate2.models.Whisper(
RuntimeError: CUDA failed with error CUDA driver version is insufficient for CUDA runtime version
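The traceback means the host's NVIDIA driver is older than the CUDA runtime that CTranslate2 (the backend of faster-whisper) was built against, so the GPU cannot be used at all. Short of updating the driver or choosing hardware with a newer one, the app can at least avoid crashing by falling back to CPU inference. A minimal sketch of that fallback; `load_with_fallback` is a hypothetical helper, not part of faster-whisper:

```python
def load_with_fallback(loader, model_size="large-v3"):
    """Try to load a model on the GPU; fall back to CPU if CUDA is unusable.

    `loader` is any callable with a WhisperModel-style signature
    (model_size, device=..., compute_type=...), e.g. faster_whisper.WhisperModel.
    """
    try:
        # float16 is the usual compute type for GPU inference.
        return loader(model_size, device="cuda", compute_type="float16")
    except RuntimeError:
        # Raised e.g. when the CUDA driver is older than the CUDA runtime.
        # int8 keeps CPU inference reasonably fast and memory-friendly.
        return loader(model_size, device="cpu", compute_type="int8")


# Intended use in app.py (assumes faster-whisper is installed):
# from faster_whisper import WhisperModel
# whisper_model = load_with_fallback(WhisperModel)
```

CPU inference with large-v3 will be much slower than GPU; on a managed platform the proper fix is still to run on an image whose driver matches the CUDA runtime.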
