runtime error
Exit code: 1. Reason:
.11/site-packages/vllm/transformers_utils/config.py", line 201, in get_config
ERROR 08-04 09:24:47 engine.py:389]     raise ValueError(f"No supported config format found in {model}")
ERROR 08-04 09:24:47 engine.py:389] ValueError: No supported config format found in convergence-ai/proxy-lite-3b
Process SpawnProcess-1:
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/usr/local/lib/python3.11/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/lib/python3.11/site-packages/vllm/engine/multiprocessing/engine.py", line 391, in run_mp_engine
    raise e
  File "/usr/local/lib/python3.11/site-packages/vllm/engine/multiprocessing/engine.py", line 380, in run_mp_engine
    engine = MQLLMEngine.from_engine_args(engine_args=engine_args,
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/vllm/engine/multiprocessing/engine.py", line 118, in from_engine_args
    engine_config = engine_args.create_engine_config(usage_context)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/vllm/engine/arg_utils.py", line 1075, in create_engine_config
    model_config = self.create_model_config()
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/vllm/engine/arg_utils.py", line 998, in create_model_config
    return ModelConfig(
           ^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/vllm/config.py", line 302, in __init__
    hf_config = get_config(self.model, trust_remote_code, revision,
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/vllm/transformers_utils/config.py", line 201, in get_config
    raise ValueError(f"No supported config format found in {model}")
ValueError: No supported config format found in convergence-ai/proxy-lite-3b
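As the traceback shows, vLLM's get_config raises this ValueError when it cannot find a config file in a format it recognizes in the model repo. A minimal sketch to check which config files the repo actually exposes, assuming the huggingface_hub package is installed (the repo id is copied from the error above; this is just a diagnostic, not a fix):

# Sketch: list repo files and print anything that looks like a config file,
# to confirm whether a config.json (or similar) is visible to vLLM.
from huggingface_hub import HfApi

repo_id = "convergence-ai/proxy-lite-3b"  # model id from the error message
files = HfApi().list_repo_files(repo_id)
print([f for f in files if "config" in f])

If nothing config-like is listed (or the repo is gated and the Space has no access token), vLLM's config lookup would fail in exactly this way.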