2024-09-30 08:49:49 | INFO | model_worker | args: Namespace(host='0.0.0.0', port=40000, worker_address='http://localhost:40000', controller_address='http://localhost:20001', model_path='/home/jack/Projects/yixin-llm/merge_med_llava_3', model_base=None, model_name=None, device='cuda', multi_modal=False, limit_model_concurrency=5, stream_interval=1, no_register=False, load_8bit=False, load_4bit=False)
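(For context, the Namespace logged above is consistent with a worker launch along the following lines. This is a sketch reconstructed from the logged arguments and the usual llava.serve.model_worker flag names, not a command captured in the log itself; flag spellings may differ in this repo.)

python -m llava.serve.model_worker \
    --host 0.0.0.0 --port 40000 \
    --controller-address http://localhost:20001 \
    --worker-address http://localhost:40000 \
    --model-path /home/jack/Projects/yixin-llm/merge_med_llava_3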
2024-09-30 08:49:49 | INFO | model_worker | Loading the model merge_med_llava_3 on worker c80683 ...
2024-09-30 08:49:49 | WARNING | transformers.models.llama.tokenization_llama | You are using the legacy behaviour of the <class 'transformers.models.llama.tokenization_llama.LlamaTokenizer'>. This means that tokens that come after special tokens will not be properly handled. We recommend you to read the related pull request available at https://github.com/huggingface/transformers/pull/24565
2024-09-30 08:49:50 | ERROR | stderr | /home/jack/anaconda3/envs/llavaplus/lib/python3.10/site-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
2024-09-30 08:49:50 | ERROR | stderr |   warnings.warn(
2024-09-30 08:49:50 | ERROR | stderr | Loading checkpoint shards:   0%|          | 0/2
2024-09-30 08:49:57 | ERROR | stderr | Loading checkpoint shards:  50%|█████     | 1/2
2024-09-30 08:49:59 | ERROR | stderr | Loading checkpoint shards: 100%|██████████| 2/2
2024-09-30 08:49:59 | ERROR | stderr | Loading checkpoint shards: 100%|██████████| 2/2
2024-09-30 08:49:59 | ERROR | stderr |
2024-09-30 08:50:01 | INFO | model_worker | Register to controller
2024-09-30 08:50:01 | ERROR | stderr | INFO: Started server process
2024-09-30 08:50:01 | ERROR | stderr | INFO: Waiting for application startup.
2024-09-30 08:50:01 | ERROR | stderr | INFO: Application startup complete.
2024-09-30 08:50:01 | ERROR | stderr | INFO: Uvicorn running on http://0.0.0.0:40000 (Press CTRL+C to quit)
2024-09-30 08:50:16 | INFO | model_worker | Send heart beat. Models:
2024-09-30 08:55:20 | ERROR | stderr | Traceback (most recent call last):
2024-09-30 08:55:20 | ERROR | stderr |   File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/runpy.py", line 196, in _run_module_as_main
2024-09-30 08:55:20 | ERROR | stderr |     return _run_code(code, main_globals, None,
2024-09-30 08:55:20 | ERROR | stderr |   File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/runpy.py", line 86, in _run_code
2024-09-30 08:55:20 | ERROR | stderr |     exec(code, run_globals)
2024-09-30 08:55:20 | ERROR | stderr |   File "/data1/jackdata/yixin-llm-data/yptests/MMedAgent_demo/llava/serve/model_worker.py", line 285, in <module>
2024-09-30 08:55:20 | ERROR | stderr |     uvicorn.run(app, host=args.host, port=args.port, log_level="info")
2024-09-30 08:55:20 | ERROR | stderr |   File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/site-packages/uvicorn/main.py", line 575, in run
2024-09-30 08:55:20 | ERROR | stderr |     server.run()
2024-09-30 08:55:20 | ERROR | stderr |   File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/site-packages/uvicorn/server.py", line 65, in run
2024-09-30 08:55:20 | ERROR | stderr |     return asyncio.run(self.serve(sockets=sockets))
2024-09-30 08:55:20 | ERROR | stderr |   File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/asyncio/runners.py", line 44, in run
2024-09-30 08:55:20 | ERROR | stderr |     return loop.run_until_complete(main)
2024-09-30 08:55:20 | ERROR | stderr |   File "uvloop/loop.pyx", line 1511, in uvloop.loop.Loop.run_until_complete
2024-09-30 08:55:20 | ERROR | stderr |   File "uvloop/loop.pyx", line 1504, in uvloop.loop.Loop.run_until_complete
2024-09-30 08:55:20 | ERROR | stderr |   File "uvloop/loop.pyx", line 1377, in uvloop.loop.Loop.run_forever
2024-09-30 08:55:20 | ERROR | stderr |   File "uvloop/loop.pyx", line 555, in uvloop.loop.Loop._run
2024-09-30 08:55:20 | ERROR | stderr |   File "uvloop/loop.pyx", line 474, in uvloop.loop.Loop._on_idle
2024-09-30 08:55:20 | ERROR | stderr |   File "uvloop/cbhandles.pyx", line 83, in uvloop.loop.Handle._run
2024-09-30 08:55:20 | ERROR | stderr |   File "uvloop/cbhandles.pyx", line 63, in uvloop.loop.Handle._run
2024-09-30 08:55:20 | ERROR | stderr |   File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/site-packages/uvicorn/server.py", line 68, in serve
2024-09-30 08:55:20 | ERROR | stderr |     with self.capture_signals():
2024-09-30 08:55:20 | ERROR | stderr |   File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/contextlib.py", line 142, in __exit__
2024-09-30 08:55:20 | ERROR | stderr |     next(self.gen)
2024-09-30 08:55:20 | ERROR | stderr |   File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/site-packages/uvicorn/server.py", line 328, in capture_signals
2024-09-30 08:55:20 | ERROR | stderr |     signal.raise_signal(captured_signal)
2024-09-30 08:55:20 | ERROR | stderr | KeyboardInterrupt
2024-09-30 08:55:20 | ERROR | stderr | Exception ignored in: <module 'threading' from '/home/jack/anaconda3/envs/llavaplus/lib/python3.10/threading.py'>
2024-09-30 08:55:20 | ERROR | stderr | Traceback (most recent call last):
2024-09-30 08:55:20 | ERROR | stderr |   File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/threading.py", line 1567, in _shutdown
2024-09-30 08:55:20 | ERROR | stderr |     lock.acquire()
2024-09-30 08:55:20 | ERROR | stderr | KeyboardInterrupt: