Upload folder using huggingface_hub
This view is limited to 50 files because it contains too many changes. See raw diff
- test/dashboard.err +0 -0
- test/dashboard.log +88 -0
- test/dashboard.out +0 -0
- test/dashboard_DataHead.err +0 -0
- test/dashboard_DataHead.log +5 -0
- test/dashboard_DataHead.out +0 -0
- test/dashboard_EventHead.err +0 -0
- test/dashboard_EventHead.log +6 -0
- test/dashboard_EventHead.out +0 -0
- test/dashboard_JobHead.err +0 -0
- test/dashboard_JobHead.log +5 -0
- test/dashboard_JobHead.out +0 -0
- test/dashboard_MetricsHead.err +0 -0
- test/dashboard_MetricsHead.log +5 -0
- test/dashboard_MetricsHead.out +0 -0
- test/dashboard_NodeHead.err +0 -0
- test/dashboard_NodeHead.log +7 -0
- test/dashboard_NodeHead.out +0 -0
- test/dashboard_ReportHead.err +0 -0
- test/dashboard_ReportHead.log +5 -0
- test/dashboard_ReportHead.out +0 -0
- test/dashboard_ServeHead.err +0 -0
- test/dashboard_ServeHead.log +5 -0
- test/dashboard_ServeHead.out +0 -0
- test/dashboard_StateHead.err +0 -0
- test/dashboard_StateHead.log +5 -0
- test/dashboard_StateHead.out +0 -0
- test/dashboard_TrainHead.err +0 -0
- test/dashboard_TrainHead.log +5 -0
- test/dashboard_TrainHead.out +0 -0
- test/dashboard_agent.err +0 -0
- test/dashboard_agent.log +77 -0
- test/dashboard_agent.out +0 -0
- test/debug_state.txt +199 -0
- test/events/event_AUTOSCALER.log +0 -0
- test/events/event_CORE_WORKER_1384.log +0 -0
- test/events/event_CORE_WORKER_1922.log +0 -0
- test/events/event_CORE_WORKER_1923.log +0 -0
- test/events/event_CORE_WORKER_1924.log +0 -0
- test/events/event_CORE_WORKER_1925.log +0 -0
- test/events/event_CORE_WORKER_1926.log +0 -0
- test/events/event_CORE_WORKER_1927.log +0 -0
- test/events/event_CORE_WORKER_1928.log +0 -0
- test/events/event_CORE_WORKER_1929.log +0 -0
- test/events/event_CORE_WORKER_2410.log +1 -0
- test/events/event_GCS.log +0 -0
- test/events/event_RAYLET.log +0 -0
- test/export_events/event_EXPORT_ACTOR.log +0 -0
- test/export_events/event_EXPORT_DRIVER_JOB.log +0 -0
- test/export_events/event_EXPORT_NODE.log +0 -0
test/dashboard.err
ADDED
File without changes
test/dashboard.log
ADDED
@@ -0,0 +1,88 @@
+2026-02-27 00:03:44,994 INFO utils.py:307 -- Get all modules by type: DashboardHeadModule
+2026-02-27 00:03:45,535 INFO utils.py:340 -- Available modules: [<class 'ray.dashboard.modules.usage_stats.usage_stats_head.UsageStatsHead'>]
+2026-02-27 00:03:45,535 INFO head.py:235 -- DashboardHeadModules to load: None.
+2026-02-27 00:03:45,536 INFO head.py:238 -- Loading DashboardHeadModule: <class 'ray.dashboard.modules.usage_stats.usage_stats_head.UsageStatsHead'>.
+2026-02-27 00:03:45,536 INFO head.py:242 -- Loaded 1 dashboard head modules: [<ray.dashboard.modules.usage_stats.usage_stats_head.UsageStatsHead object at 0x7713805d69c0>].
+2026-02-27 00:03:45,536 INFO utils.py:307 -- Get all modules by type: SubprocessModule
+2026-02-27 00:03:45,538 INFO utils.py:340 -- Available modules: [<class 'ray.dashboard.modules.metrics.metrics_head.MetricsHead'>, <class 'ray.dashboard.modules.data.data_head.DataHead'>, <class 'ray.dashboard.modules.event.event_head.EventHead'>, <class 'ray.dashboard.modules.job.job_head.JobHead'>, <class 'ray.dashboard.modules.node.node_head.NodeHead'>, <class 'ray.dashboard.modules.reporter.reporter_head.ReportHead'>, <class 'ray.dashboard.modules.serve.serve_head.ServeHead'>, <class 'ray.dashboard.modules.state.state_head.StateHead'>, <class 'ray.dashboard.modules.train.train_head.TrainHead'>]
+2026-02-27 00:03:45,539 INFO head.py:292 -- Loading SubprocessModule: <class 'ray.dashboard.modules.metrics.metrics_head.MetricsHead'>.
+2026-02-27 00:03:45,539 INFO head.py:292 -- Loading SubprocessModule: <class 'ray.dashboard.modules.data.data_head.DataHead'>.
+2026-02-27 00:03:45,539 INFO head.py:292 -- Loading SubprocessModule: <class 'ray.dashboard.modules.event.event_head.EventHead'>.
+2026-02-27 00:03:45,539 INFO head.py:292 -- Loading SubprocessModule: <class 'ray.dashboard.modules.job.job_head.JobHead'>.
+2026-02-27 00:03:45,539 INFO head.py:292 -- Loading SubprocessModule: <class 'ray.dashboard.modules.node.node_head.NodeHead'>.
+2026-02-27 00:03:45,539 INFO head.py:292 -- Loading SubprocessModule: <class 'ray.dashboard.modules.reporter.reporter_head.ReportHead'>.
+2026-02-27 00:03:45,539 INFO head.py:292 -- Loading SubprocessModule: <class 'ray.dashboard.modules.serve.serve_head.ServeHead'>.
+2026-02-27 00:03:45,539 INFO head.py:292 -- Loading SubprocessModule: <class 'ray.dashboard.modules.state.state_head.StateHead'>.
+2026-02-27 00:03:45,539 INFO head.py:292 -- Loading SubprocessModule: <class 'ray.dashboard.modules.train.train_head.TrainHead'>.
+2026-02-27 00:03:45,539 INFO head.py:296 -- Loaded 9 subprocess modules: [<ray.dashboard.subprocesses.handle.SubprocessModuleHandle object at 0x7713808333e0>, <ray.dashboard.subprocesses.handle.SubprocessModuleHandle object at 0x77137bbb7140>, <ray.dashboard.subprocesses.handle.SubprocessModuleHandle object at 0x771381022330>, <ray.dashboard.subprocesses.handle.SubprocessModuleHandle object at 0x77137dbd2ba0>, <ray.dashboard.subprocesses.handle.SubprocessModuleHandle object at 0x7713749466c0>, <ray.dashboard.subprocesses.handle.SubprocessModuleHandle object at 0x771378141700>, <ray.dashboard.subprocesses.handle.SubprocessModuleHandle object at 0x771373b37020>, <ray.dashboard.subprocesses.handle.SubprocessModuleHandle object at 0x7713749eba10>, <ray.dashboard.subprocesses.handle.SubprocessModuleHandle object at 0x7713749e8aa0>].
+2026-02-27 00:03:47,104 INFO head.py:311 -- Starting dashboard metrics server on port 44227
+2026-02-27 00:03:47,110 INFO head.py:435 -- Initialize the http server.
+2026-02-27 00:03:47,111 INFO http_server_head.py:111 -- Setup static dir for dashboard: /usr/local/lib/python3.12/dist-packages/ray/dashboard/client/build
+2026-02-27 00:03:47,114 WARNING __init__.py:161 -- DeprecationWarning: `ray.ray_constants.DASHBOARD_CLIENT_MAX_SIZE` is a private attribute and access will be removed in a future Ray version.
+2026-02-27 00:03:47,140 INFO http_server_head.py:440 -- Dashboard head http address: 127.0.0.1:8265
+2026-02-27 00:03:47,140 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /usage_stats_enabled> -> <function UsageStatsHead.get_usage_stats_enabled at 0x771373b853a0>
+2026-02-27 00:03:47,140 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /cluster_id> -> <function UsageStatsHead.get_cluster_id at 0x771373b854e0>
+2026-02-27 00:03:47,140 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /> -> <function HttpServerDashboardHead.get_index at 0x771373bc4a40>
+2026-02-27 00:03:47,140 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /favicon.ico> -> <function HttpServerDashboardHead.get_favicon at 0x771373bc4b80>
+2026-02-27 00:03:47,140 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /timezone> -> <function HttpServerDashboardHead.get_timezone at 0x771373bc4cc0>
+2026-02-27 00:03:47,140 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/authentication_mode> -> <function HttpServerDashboardHead.get_authentication_mode at 0x771373bc4e00>
+2026-02-27 00:03:47,140 INFO http_server_head.py:447 -- <ResourceRoute [POST] <PlainResource /api/authenticate> -> <function HttpServerDashboardHead.authenticate at 0x771373bc4f40>
+2026-02-27 00:03:47,140 INFO http_server_head.py:447 -- <ResourceRoute [GET] <StaticResource /static -> PosixPath('/usr/local/lib/python3.12/dist-packages/ray/dashboard/client/build/static')> -> <bound method StaticResource._handle of <StaticResource /static -> PosixPath('/usr/local/lib/python3.12/dist-packages/ray/dashboard/client/build/static')>>
+2026-02-27 00:03:47,140 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/grafana_health> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713781eefc0>
+2026-02-27 00:03:47,141 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/prometheus_health> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713781ef100>
+2026-02-27 00:03:47,141 INFO http_server_head.py:447 -- <ResourceRoute [GET] <DynamicResource /api/data/datasets/{job_id}> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713781ef4c0>
+2026-02-27 00:03:47,141 INFO http_server_head.py:447 -- <ResourceRoute [POST] <PlainResource /report_events> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713753c4400>
+2026-02-27 00:03:47,141 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /events> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713753c45e0>
+2026-02-27 00:03:47,141 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/v0/cluster_events> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713753c4860>
+2026-02-27 00:03:47,141 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/version> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713753eac00>
+2026-02-27 00:03:47,141 INFO http_server_head.py:447 -- <ResourceRoute [GET] <DynamicResource /api/packages/{protocol}/{package_name}> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713753eade0>
+2026-02-27 00:03:47,141 INFO http_server_head.py:447 -- <ResourceRoute [PUT] <DynamicResource /api/packages/{protocol}/{package_name}> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713753eb060>
+2026-02-27 00:03:47,141 INFO http_server_head.py:447 -- <ResourceRoute [POST] <PlainResource /api/jobs/> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713753eb240>
+2026-02-27 00:03:47,141 INFO http_server_head.py:447 -- <ResourceRoute [POST] <DynamicResource /api/jobs/{job_or_submission_id}/stop> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713753eb380>
+2026-02-27 00:03:47,141 INFO http_server_head.py:447 -- <ResourceRoute [DELETE] <DynamicResource /api/jobs/{job_or_submission_id}> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713753eb4c0>
+2026-02-27 00:03:47,141 INFO http_server_head.py:447 -- <ResourceRoute [GET] <DynamicResource /api/jobs/{job_or_submission_id}> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713753eb600>
+2026-02-27 00:03:47,141 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/jobs/> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713753eb740>
+2026-02-27 00:03:47,141 INFO http_server_head.py:447 -- <ResourceRoute [GET] <DynamicResource /api/jobs/{job_or_submission_id}/logs> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713753eb880>
+2026-02-27 00:03:47,141 INFO http_server_head.py:447 -- <ResourceRoute [GET] <DynamicResource /api/jobs/{job_or_submission_id}/logs/tail> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713753eb9c0>
+2026-02-27 00:03:47,142 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/component_activities> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713753ebba0>
+2026-02-27 00:03:47,142 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /nodes> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713749b8a40>
+2026-02-27 00:03:47,142 INFO http_server_head.py:447 -- <ResourceRoute [GET] <DynamicResource /nodes/{node_id}> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713749b8c20>
+2026-02-27 00:03:47,142 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /logical/actors> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713749b9300>
+2026-02-27 00:03:47,142 INFO http_server_head.py:447 -- <ResourceRoute [GET] <DynamicResource /logical/actors/{actor_id}> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713749b94e0>
+2026-02-27 00:03:47,142 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /test/dump> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x7713749b96c0>
+2026-02-27 00:03:47,142 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/v0/cluster_metadata> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b4f880>
+2026-02-27 00:03:47,142 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/cluster_status> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b4f9c0>
+2026-02-27 00:03:47,142 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /task/traceback> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b4fc40>
+2026-02-27 00:03:47,142 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /task/cpu_profile> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b4fd80>
+2026-02-27 00:03:47,142 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /worker/traceback> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b4fec0>
+2026-02-27 00:03:47,142 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /worker/cpu_profile> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b74040>
+2026-02-27 00:03:47,142 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /worker/gpu_profile> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b74180>
+2026-02-27 00:03:47,142 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /memory_profile> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b742c0>
+2026-02-27 00:03:47,142 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/gcs_healthz> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b74400>
+2026-02-27 00:03:47,143 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/actors/kill> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b74540>
+2026-02-27 00:03:47,143 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/prometheus/sd> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b74680>
+2026-02-27 00:03:47,143 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/ray/version> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b75260>
+2026-02-27 00:03:47,143 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/serve/applications/> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b75300>
+2026-02-27 00:03:47,143 INFO http_server_head.py:447 -- <ResourceRoute [DELETE] <PlainResource /api/serve/applications/> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b75580>
+2026-02-27 00:03:47,143 INFO http_server_head.py:447 -- <ResourceRoute [PUT] <PlainResource /api/serve/applications/> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b75760>
+2026-02-27 00:03:47,143 INFO http_server_head.py:447 -- <ResourceRoute [POST] <DynamicResource /api/v1/applications/{application_name}/deployments/{deployment_name}/scale> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b75a80>
+2026-02-27 00:03:47,143 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/v0/actors> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b763e0>
+2026-02-27 00:03:47,143 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/v0/jobs> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b765c0>
+2026-02-27 00:03:47,143 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/v0/nodes> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b767a0>
+2026-02-27 00:03:47,143 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/v0/placement_groups> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b76980>
+2026-02-27 00:03:47,143 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/v0/workers> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b76b60>
+2026-02-27 00:03:47,143 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/v0/tasks> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b76d40>
+2026-02-27 00:03:47,143 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/v0/objects> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b76f20>
+2026-02-27 00:03:47,143 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/v0/runtime_envs> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b77100>
+2026-02-27 00:03:47,143 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/v0/logs> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b772e0>
+2026-02-27 00:03:47,144 INFO http_server_head.py:447 -- <ResourceRoute [GET] <DynamicResource /api/v0/logs/{media_type}> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b774c0>
+2026-02-27 00:03:47,144 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/v0/tasks/summarize> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b776a0>
+2026-02-27 00:03:47,144 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/v0/actors/summarize> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b77880>
+2026-02-27 00:03:47,144 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/v0/objects/summarize> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b77a60>
+2026-02-27 00:03:47,144 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/v0/tasks/timeline> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b77c40>
+2026-02-27 00:03:47,144 INFO http_server_head.py:447 -- <ResourceRoute [GET] <DynamicResource /api/v0/delay/{delay_s}> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b77d80>
+2026-02-27 00:03:47,144 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/train/v2/runs/v1> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b845e0>
+2026-02-27 00:03:47,144 INFO http_server_head.py:447 -- <ResourceRoute [GET] <PlainResource /api/train/v2/runs> -> <function SubprocessRouteTable._register_route.<locals>._wrapper.<locals>.parent_side_handler at 0x771373b84a40>
+2026-02-27 00:03:47,144 INFO http_server_head.py:448 -- Registered 63 routes.
+2026-02-27 00:03:47,144 INFO head.py:440 -- http server initialized at 127.0.0.1:8265
+2026-02-27 00:03:47,151 INFO usage_stats_head.py:200 -- Usage reporting is disabled.
test/dashboard.out
ADDED
File without changes
test/dashboard_DataHead.err
ADDED
File without changes
test/dashboard_DataHead.log
ADDED
@@ -0,0 +1,5 @@
+2026-02-27 00:03:46,553 INFO module.py:210 -- Starting module DataHead with incarnation 0 and config SubprocessModuleConfig(cluster_id_hex='53ef51bb0bb70a80ae057770eba1177484524b98986050e67bb3e439', gcs_address='10.128.0.163:57355', session_name='session_2026-02-27_00-03-44_103874_1384', temp_dir='/tmp/ray', session_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384', logging_level=20, logging_format='%(asctime)s\t%(levelname)s %(filename)s:%(lineno)s -- %(message)s', log_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/logs', logging_filename='dashboard.log', logging_rotate_bytes=536870912, logging_rotate_backup_count=5, socket_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets')
+2026-02-27 00:03:46,553 WARNING __init__.py:161 -- DeprecationWarning: `ray.ray_constants.DASHBOARD_CLIENT_MAX_SIZE` is a private attribute and access will be removed in a future Ray version.
+2026-02-27 00:03:46,565 INFO module.py:142 -- Started aiohttp server over /tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets/dash_DataHead.
+2026-02-27 00:03:46,565 INFO module.py:225 -- Module DataHead initialized, receiving messages...
+2026-02-27 00:04:03,585 WARNING module.py:82 -- Parent process 1506 died. Exiting...
test/dashboard_DataHead.out
ADDED
File without changes
test/dashboard_EventHead.err
ADDED
File without changes
test/dashboard_EventHead.log
ADDED
@@ -0,0 +1,6 @@
+2026-02-27 00:03:47,093 INFO module.py:210 -- Starting module EventHead with incarnation 0 and config SubprocessModuleConfig(cluster_id_hex='53ef51bb0bb70a80ae057770eba1177484524b98986050e67bb3e439', gcs_address='10.128.0.163:57355', session_name='session_2026-02-27_00-03-44_103874_1384', temp_dir='/tmp/ray', session_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384', logging_level=20, logging_format='%(asctime)s\t%(levelname)s %(filename)s:%(lineno)s -- %(message)s', log_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/logs', logging_filename='dashboard.log', logging_rotate_bytes=536870912, logging_rotate_backup_count=5, socket_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets')
+2026-02-27 00:03:47,096 WARNING __init__.py:161 -- DeprecationWarning: `ray.ray_constants.DASHBOARD_CLIENT_MAX_SIZE` is a private attribute and access will be removed in a future Ray version.
+2026-02-27 00:03:47,099 INFO module.py:142 -- Started aiohttp server over /tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets/dash_EventHead.
+2026-02-27 00:03:47,100 INFO event_utils.py:130 -- Monitor events logs modified after 1772148826.8917453 on /tmp/ray/session_2026-02-27_00-03-44_103874_1384/logs/events, the source types are all.
+2026-02-27 00:03:47,100 INFO module.py:225 -- Module EventHead initialized, receiving messages...
+2026-02-27 00:04:03,129 WARNING module.py:82 -- Parent process 1506 died. Exiting...
test/dashboard_EventHead.out
ADDED
File without changes
test/dashboard_JobHead.err
ADDED
File without changes
test/dashboard_JobHead.log
ADDED
@@ -0,0 +1,5 @@
+2026-02-27 00:03:46,845 INFO module.py:210 -- Starting module JobHead with incarnation 0 and config SubprocessModuleConfig(cluster_id_hex='53ef51bb0bb70a80ae057770eba1177484524b98986050e67bb3e439', gcs_address='10.128.0.163:57355', session_name='session_2026-02-27_00-03-44_103874_1384', temp_dir='/tmp/ray', session_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384', logging_level=20, logging_format='%(asctime)s\t%(levelname)s %(filename)s:%(lineno)s -- %(message)s', log_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/logs', logging_filename='dashboard.log', logging_rotate_bytes=536870912, logging_rotate_backup_count=5, socket_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets')
+2026-02-27 00:03:46,848 WARNING __init__.py:161 -- DeprecationWarning: `ray.ray_constants.DASHBOARD_CLIENT_MAX_SIZE` is a private attribute and access will be removed in a future Ray version.
+2026-02-27 00:03:46,853 INFO module.py:142 -- Started aiohttp server over /tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets/dash_JobHead.
+2026-02-27 00:03:46,853 INFO module.py:225 -- Module JobHead initialized, receiving messages...
+2026-02-27 00:04:02,879 WARNING module.py:82 -- Parent process 1506 died. Exiting...
test/dashboard_JobHead.out
ADDED
File without changes
test/dashboard_MetricsHead.err
ADDED
File without changes
test/dashboard_MetricsHead.log
ADDED
@@ -0,0 +1,5 @@
+2026-02-27 00:03:46,752 INFO module.py:210 -- Starting module MetricsHead with incarnation 0 and config SubprocessModuleConfig(cluster_id_hex='53ef51bb0bb70a80ae057770eba1177484524b98986050e67bb3e439', gcs_address='10.128.0.163:57355', session_name='session_2026-02-27_00-03-44_103874_1384', temp_dir='/tmp/ray', session_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384', logging_level=20, logging_format='%(asctime)s\t%(levelname)s %(filename)s:%(lineno)s -- %(message)s', log_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/logs', logging_filename='dashboard.log', logging_rotate_bytes=536870912, logging_rotate_backup_count=5, socket_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets')
+2026-02-27 00:03:46,752 WARNING __init__.py:161 -- DeprecationWarning: `ray.ray_constants.DASHBOARD_CLIENT_MAX_SIZE` is a private attribute and access will be removed in a future Ray version.
+2026-02-27 00:03:46,760 INFO module.py:142 -- Started aiohttp server over /tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets/dash_MetricsHead.
+2026-02-27 00:03:46,838 INFO module.py:225 -- Module MetricsHead initialized, receiving messages...
+2026-02-27 00:04:02,781 WARNING module.py:82 -- Parent process 1506 died. Exiting...
test/dashboard_MetricsHead.out
ADDED
File without changes
test/dashboard_NodeHead.err
ADDED
File without changes
test/dashboard_NodeHead.log
ADDED
@@ -0,0 +1,7 @@
+2026-02-27 00:03:46,975 INFO module.py:210 -- Starting module NodeHead with incarnation 0 and config SubprocessModuleConfig(cluster_id_hex='53ef51bb0bb70a80ae057770eba1177484524b98986050e67bb3e439', gcs_address='10.128.0.163:57355', session_name='session_2026-02-27_00-03-44_103874_1384', temp_dir='/tmp/ray', session_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384', logging_level=20, logging_format='%(asctime)s\t%(levelname)s %(filename)s:%(lineno)s -- %(message)s', log_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/logs', logging_filename='dashboard.log', logging_rotate_bytes=536870912, logging_rotate_backup_count=5, socket_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets')
+2026-02-27 00:03:46,975 WARNING __init__.py:161 -- DeprecationWarning: `ray.ray_constants.DASHBOARD_CLIENT_MAX_SIZE` is a private attribute and access will be removed in a future Ray version.
+2026-02-27 00:03:46,980 INFO module.py:142 -- Started aiohttp server over /tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets/dash_NodeHead.
+2026-02-27 00:03:46,981 INFO module.py:225 -- Module NodeHead initialized, receiving messages...
+2026-02-27 00:03:46,987 INFO node_head.py:570 -- Getting all actor info from GCS.
+2026-02-27 00:03:46,988 INFO node_head.py:587 -- Received 0 actor info from GCS.
+2026-02-27 00:04:03,007 WARNING module.py:82 -- Parent process 1506 died. Exiting...
test/dashboard_NodeHead.out
ADDED
File without changes
test/dashboard_ReportHead.err
ADDED
File without changes
test/dashboard_ReportHead.log
ADDED
@@ -0,0 +1,5 @@
+2026-02-27 00:03:47,027 INFO module.py:210 -- Starting module ReportHead with incarnation 0 and config SubprocessModuleConfig(cluster_id_hex='53ef51bb0bb70a80ae057770eba1177484524b98986050e67bb3e439', gcs_address='10.128.0.163:57355', session_name='session_2026-02-27_00-03-44_103874_1384', temp_dir='/tmp/ray', session_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384', logging_level=20, logging_format='%(asctime)s\t%(levelname)s %(filename)s:%(lineno)s -- %(message)s', log_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/logs', logging_filename='dashboard.log', logging_rotate_bytes=536870912, logging_rotate_backup_count=5, socket_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets')
+2026-02-27 00:03:47,030 WARNING __init__.py:161 -- DeprecationWarning: `ray.ray_constants.DASHBOARD_CLIENT_MAX_SIZE` is a private attribute and access will be removed in a future Ray version.
+2026-02-27 00:03:47,033 INFO module.py:142 -- Started aiohttp server over /tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets/dash_ReportHead.
+2026-02-27 00:03:47,036 INFO module.py:225 -- Module ReportHead initialized, receiving messages...
+2026-02-27 00:04:03,061 WARNING module.py:82 -- Parent process 1506 died. Exiting...
test/dashboard_ReportHead.out
ADDED
File without changes
test/dashboard_ServeHead.err
ADDED
File without changes
test/dashboard_ServeHead.log
ADDED
|
@@ -0,0 +1,5 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
2026-02-27 00:03:46,925 INFO module.py:210 -- Starting module ServeHead with incarnation 0 and config SubprocessModuleConfig(cluster_id_hex='53ef51bb0bb70a80ae057770eba1177484524b98986050e67bb3e439', gcs_address='10.128.0.163:57355', session_name='session_2026-02-27_00-03-44_103874_1384', temp_dir='/tmp/ray', session_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384', logging_level=20, logging_format='%(asctime)s\t%(levelname)s %(filename)s:%(lineno)s -- %(message)s', log_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/logs', logging_filename='dashboard.log', logging_rotate_bytes=536870912, logging_rotate_backup_count=5, socket_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets')
2026-02-27 00:03:46,930	WARNING __init__.py:161 -- DeprecationWarning: `ray.ray_constants.DASHBOARD_CLIENT_MAX_SIZE` is a private attribute and access will be removed in a future Ray version.
2026-02-27 00:03:46,935	INFO module.py:142 -- Started aiohttp server over /tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets/dash_ServeHead.
2026-02-27 00:03:46,935	INFO module.py:225 -- Module ServeHead initialized, receiving messages...
2026-02-27 00:04:02,959	WARNING module.py:82 -- Parent process 1506 died. Exiting...
test/dashboard_ServeHead.out
ADDED
File without changes

test/dashboard_StateHead.err
ADDED
File without changes

test/dashboard_StateHead.log
ADDED
@@ -0,0 +1,5 @@
2026-02-27 00:03:46,954 INFO module.py:210 -- Starting module StateHead with incarnation 0 and config SubprocessModuleConfig(cluster_id_hex='53ef51bb0bb70a80ae057770eba1177484524b98986050e67bb3e439', gcs_address='10.128.0.163:57355', session_name='session_2026-02-27_00-03-44_103874_1384', temp_dir='/tmp/ray', session_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384', logging_level=20, logging_format='%(asctime)s\t%(levelname)s %(filename)s:%(lineno)s -- %(message)s', log_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/logs', logging_filename='dashboard.log', logging_rotate_bytes=536870912, logging_rotate_backup_count=5, socket_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets')
2026-02-27 00:03:46,958	WARNING __init__.py:161 -- DeprecationWarning: `ray.ray_constants.DASHBOARD_CLIENT_MAX_SIZE` is a private attribute and access will be removed in a future Ray version.
2026-02-27 00:03:46,963	INFO module.py:142 -- Started aiohttp server over /tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets/dash_StateHead.
2026-02-27 00:03:46,964	INFO module.py:225 -- Module StateHead initialized, receiving messages...
2026-02-27 00:04:02,990	WARNING module.py:82 -- Parent process 1506 died. Exiting...
test/dashboard_StateHead.out
ADDED
File without changes

test/dashboard_TrainHead.err
ADDED
File without changes

test/dashboard_TrainHead.log
ADDED
@@ -0,0 +1,5 @@
2026-02-27 00:03:46,680 INFO module.py:210 -- Starting module TrainHead with incarnation 0 and config SubprocessModuleConfig(cluster_id_hex='53ef51bb0bb70a80ae057770eba1177484524b98986050e67bb3e439', gcs_address='10.128.0.163:57355', session_name='session_2026-02-27_00-03-44_103874_1384', temp_dir='/tmp/ray', session_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384', logging_level=20, logging_format='%(asctime)s\t%(levelname)s %(filename)s:%(lineno)s -- %(message)s', log_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/logs', logging_filename='dashboard.log', logging_rotate_bytes=536870912, logging_rotate_backup_count=5, socket_dir='/tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets')
2026-02-27 00:03:46,680	WARNING __init__.py:161 -- DeprecationWarning: `ray.ray_constants.DASHBOARD_CLIENT_MAX_SIZE` is a private attribute and access will be removed in a future Ray version.
2026-02-27 00:03:46,690	INFO module.py:142 -- Started aiohttp server over /tmp/ray/session_2026-02-27_00-03-44_103874_1384/sockets/dash_TrainHead.
2026-02-27 00:03:46,691	INFO module.py:225 -- Module TrainHead initialized, receiving messages...
2026-02-27 00:04:02,715	WARNING module.py:82 -- Parent process 1506 died. Exiting...
test/dashboard_TrainHead.out
ADDED
File without changes

test/dashboard_agent.err
ADDED
File without changes

test/dashboard_agent.log
ADDED
@@ -0,0 +1,77 @@
2026-02-27 00:03:48,646 INFO agent.py:141 -- Dashboard agent grpc address: 10.128.0.163:51882
2026-02-27 00:03:48,647	INFO utils.py:307 -- Get all modules by type: DashboardAgentModule
2026-02-27 00:03:48,982	INFO utils.py:340 -- Available modules: [<class 'ray.dashboard.modules.aggregator.aggregator_agent.AggregatorAgent'>, <class 'ray.dashboard.modules.event.event_agent.EventAgent'>, <class 'ray.dashboard.modules.job.job_agent.JobAgent'>, <class 'ray.dashboard.modules.log.log_agent.LogAgent'>, <class 'ray.dashboard.modules.log.log_agent.LogAgentV1Grpc'>, <class 'ray.dashboard.modules.reporter.healthz_agent.HealthzAgent'>, <class 'ray.dashboard.modules.reporter.reporter_agent.ReporterAgent'>]
2026-02-27 00:03:48,982	INFO agent.py:160 -- Loading DashboardAgentModule: <class 'ray.dashboard.modules.aggregator.aggregator_agent.AggregatorAgent'>
2026-02-27 00:03:48,983	WARNING __init__.py:864 -- Overriding of current MeterProvider is not allowed
2026-02-27 00:03:48,984	INFO aggregator_agent.py:139 -- Event HTTP target is not enabled or publishing events to external HTTP service is disabled. Skipping sending events to external HTTP service. events_export_addr:
2026-02-27 00:03:48,985	WARNING __init__.py:864 -- Overriding of current MeterProvider is not allowed
2026-02-27 00:03:48,985	INFO agent.py:160 -- Loading DashboardAgentModule: <class 'ray.dashboard.modules.event.event_agent.EventAgent'>
2026-02-27 00:03:48,985	INFO event_agent.py:48 -- Event agent cache buffer size: 10240
2026-02-27 00:03:48,986	INFO agent.py:160 -- Loading DashboardAgentModule: <class 'ray.dashboard.modules.job.job_agent.JobAgent'>
2026-02-27 00:03:48,986	INFO agent.py:160 -- Loading DashboardAgentModule: <class 'ray.dashboard.modules.log.log_agent.LogAgent'>
2026-02-27 00:03:48,986	INFO agent.py:160 -- Loading DashboardAgentModule: <class 'ray.dashboard.modules.log.log_agent.LogAgentV1Grpc'>
2026-02-27 00:03:48,986	INFO agent.py:160 -- Loading DashboardAgentModule: <class 'ray.dashboard.modules.reporter.healthz_agent.HealthzAgent'>
2026-02-27 00:03:48,986	INFO agent.py:160 -- Loading DashboardAgentModule: <class 'ray.dashboard.modules.reporter.reporter_agent.ReporterAgent'>
2026-02-27 00:03:48,993	WARNING __init__.py:864 -- Overriding of current MeterProvider is not allowed
2026-02-27 00:03:49,141	WARNING gpu_profile_manager.py:82 -- [GpuProfilingManager] `dynolog` is not installed, GPU profiling will not be available.
2026-02-27 00:03:49,141	WARNING gpu_profile_manager.py:125 -- [GpuProfilingManager] GPU profiling is disabled, skipping daemon setup.
2026-02-27 00:03:49,142	INFO agent.py:165 -- Loaded 7 modules.
2026-02-27 00:03:49,145	WARNING http_server_agent.py:70 -- Failed to bind to port 52365 (attempt 1/6). Retrying in 0.19s. Error: [Errno 98] error while attempting to bind on address ('10.128.0.163', 52365): [errno 98] address already in use
2026-02-27 00:03:49,341	WARNING http_server_agent.py:70 -- Failed to bind to port 52365 (attempt 2/6). Retrying in 0.22s. Error: [Errno 98] error while attempting to bind on address ('10.128.0.163', 52365): [errno 98] address already in use
2026-02-27 00:03:49,562	WARNING http_server_agent.py:70 -- Failed to bind to port 52365 (attempt 3/6). Retrying in 0.44s. Error: [Errno 98] error while attempting to bind on address ('10.128.0.163', 52365): [errno 98] address already in use
2026-02-27 00:03:50,003	WARNING http_server_agent.py:70 -- Failed to bind to port 52365 (attempt 4/6). Retrying in 0.85s. Error: [Errno 98] error while attempting to bind on address ('10.128.0.163', 52365): [errno 98] address already in use
2026-02-27 00:03:50,856	WARNING http_server_agent.py:70 -- Failed to bind to port 52365 (attempt 5/6). Retrying in 1.67s. Error: [Errno 98] error while attempting to bind on address ('10.128.0.163', 52365): [errno 98] address already in use
2026-02-27 00:03:52,524	ERROR http_server_agent.py:76 -- Agent port #52365 failed to bind after 6 attempts.
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/dist-packages/ray/dashboard/http_server_agent.py", line 50, in _start_site_with_retry
    await site.start()
  File "/usr/local/lib/python3.12/dist-packages/aiohttp/web_runner.py", line 121, in start
    self._server = await loop.create_server(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/asyncio/base_events.py", line 1584, in create_server
    raise OSError(err.errno, msg) from None
OSError: [Errno 98] error while attempting to bind on address ('10.128.0.163', 52365): [errno 98] address already in use
2026-02-27 00:03:52,526	ERROR agent.py:195 -- Failed to start HTTP server with exception: [Errno 98] error while attempting to bind on address ('10.128.0.163', 52365): [errno 98] address already in use. The agent will stay alive but the HTTP service will be disabled.
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/dist-packages/ray/dashboard/agent.py", line 188, in run
    await self.http_server.start(modules)
  File "/usr/local/lib/python3.12/dist-packages/ray/dashboard/http_server_agent.py", line 120, in start
    site = await self._start_site_with_retry()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/ray/dashboard/http_server_agent.py", line 83, in _start_site_with_retry
    raise last_exception
  File "/usr/local/lib/python3.12/dist-packages/ray/dashboard/http_server_agent.py", line 50, in _start_site_with_retry
    await site.start()
  File "/usr/local/lib/python3.12/dist-packages/aiohttp/web_runner.py", line 121, in start
    self._server = await loop.create_server(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/asyncio/base_events.py", line 1584, in create_server
    raise OSError(err.errno, msg) from None
OSError: [Errno 98] error while attempting to bind on address ('10.128.0.163', 52365): [errno 98] address already in use
2026-02-27 00:03:52,527	INFO process_watcher.py:45 -- raylet pid is 1875
2026-02-27 00:03:52,528	INFO process_watcher.py:65 -- check_parent_via_pipe
2026-02-27 00:03:52,528	INFO event_utils.py:130 -- Monitor events logs modified after 1772148828.733918 on /tmp/ray/session_2026-02-27_00-03-44_103874_1384/logs/events, the source types are all.
2026-02-27 00:03:52,554	INFO gpu_providers.py:500 -- Using GPU Provider: NvidiaGpuProvider
2026-02-27 00:04:02,600	INFO agent.py:228 -- Terminated Raylet: ip=10.128.0.163, node_id=cf562760d44bbe7c695ad3e8c246c3a8d992e4ef7594a5654c4c19c7. _check_parent_via_pipe: The parent is dead.
2026-02-27 00:04:02,600	ERROR process_watcher.py:115 -- Raylet is terminated. Termination is unexpected. Possible reasons include: (1) SIGKILL by the user or system OOM killer, (2) Invalid memory access from Raylet causing SIGSEGV or SIGBUS, (3) Other termination signals. Last 20 lines of the Raylet logs:
[2026-02-27 00:03:47,363 I 1875 1875] (raylet) accessor.cc:540: Received address and liveness notification for node, IsAlive = 1 node_id=cf562760d44bbe7c695ad3e8c246c3a8d992e4ef7594a5654c4c19c7
[2026-02-27 00:03:47,439 I 1875 1875] (raylet) worker_pool.cc:750: [Eagerly] Start install runtime environment for job 01000000.
[2026-02-27 00:03:47,442 I 1875 1875] (raylet) worker_pool.cc:531: Started worker process with pid 1922, the token is 0
[2026-02-27 00:03:47,445 I 1875 1875] (raylet) worker_pool.cc:531: Started worker process with pid 1923, the token is 1
[2026-02-27 00:03:47,448 I 1875 1875] (raylet) worker_pool.cc:531: Started worker process with pid 1924, the token is 2
[2026-02-27 00:03:47,451 I 1875 1875] (raylet) worker_pool.cc:531: Started worker process with pid 1925, the token is 3
[2026-02-27 00:03:47,454 I 1875 1875] (raylet) worker_pool.cc:531: Started worker process with pid 1926, the token is 4
[2026-02-27 00:03:47,457 I 1875 1875] (raylet) worker_pool.cc:531: Started worker process with pid 1927, the token is 5
[2026-02-27 00:03:47,461 I 1875 1875] (raylet) worker_pool.cc:531: Started worker process with pid 1928, the token is 6
[2026-02-27 00:03:47,466 I 1875 1875] (raylet) worker_pool.cc:531: Started worker process with pid 1929, the token is 7
[2026-02-27 00:03:47,469 I 1875 1875] (raylet) runtime_env_agent_client.cc:350: Runtime Env Agent network error: NotFound: on_connect Connection refused, the server may be still starting or is already failed. Scheduling a retry in 1000ms...
[2026-02-27 00:03:48,199 I 1875 1890] (raylet) object_store.cc:37: Object store current usage 8e-09 / 9.52932 GB.
[2026-02-27 00:03:48,474 I 1875 1875] (raylet) runtime_env_agent_client.cc:393: Create runtime env for job 01000000
[2026-02-27 00:03:48,474 I 1875 1875] (raylet) worker_pool.cc:761: [Eagerly] Create runtime env successful for job 01000000.
[2026-02-27 00:03:48,572 I 1875 1875] (raylet) worker_pool.cc:740: Job 01000000 already started in worker pool.
[2026-02-27 00:03:49,568 I 1875 1875] (raylet) node_manager.cc:1437: Disconnecting worker, graceful=true, disconnect_type=1, has_creation_task_exception=false worker_id=29b15f3dea7d69a07782de663871cc2d93a35120d41e29d152021111 job_id=NIL_ID
[2026-02-27 00:03:49,644 W 1875 1890] (raylet) store.cc:365: Disconnecting client due to connection error with code 2: End of file
[2026-02-27 00:03:51,471 I 1875 1875] (raylet) runtime_env_agent_client.cc:393: Create runtime env for job 01000000
[2026-02-27 00:03:51,474 I 1875 1875] (raylet) worker_pool.cc:531: Started worker process with pid 2372, the token is 8
[2026-02-27 00:03:53,363 I 1875 1875] (raylet) metrics_agent_client.cc:54: Exporter initialized.
test/dashboard_agent.out
ADDED
File without changes

test/debug_state.txt
ADDED
@@ -0,0 +1,199 @@
NodeManager:
|
| 2 |
+
Node ID: cf562760d44bbe7c695ad3e8c246c3a8d992e4ef7594a5654c4c19c7
|
| 3 |
+
Node name: 10.128.0.163
|
| 4 |
+
InitialConfigResources: {GPU: 1, node:__internal_head__: 1, object_store_memory: 9.52932e+09, memory: 2.22351e+10, CPU: 8, node:10.128.0.163: 1, accelerator_type:L4: 1}
|
| 5 |
+
ClusterLeaseManager:
|
| 6 |
+
========== Node: cf562760d44bbe7c695ad3e8c246c3a8d992e4ef7594a5654c4c19c7 =================
|
| 7 |
+
Infeasible queue length: 0
|
| 8 |
+
Schedule queue length: 0
|
| 9 |
+
Grant queue length: 0
|
| 10 |
+
num_waiting_for_resource: 0
|
| 11 |
+
num_waiting_for_plasma_memory: 0
|
| 12 |
+
num_waiting_for_remote_node_resources: 0
|
| 13 |
+
num_worker_not_started_by_job_config_not_exist: 0
|
| 14 |
+
num_worker_not_started_by_registration_timeout: 0
|
| 15 |
+
num_tasks_waiting_for_workers: 0
|
| 16 |
+
num_cancelled_leases: 0
|
| 17 |
+
cluster_resource_scheduler state:
|
| 18 |
+
Local id: -7930779791598977017 Local resources: {"total":{GPU: [10000], node:10.128.0.163: [10000], node:__internal_head__: [10000], CPU: [80000], memory: [222350872580000], object_store_memory: [95293231100000], accelerator_type:L4: [10000]}}, "available": {GPU: [10000], node:10.128.0.163: [10000], node:__internal_head__: [10000], CPU: [70000], memory: [222350872580000], object_store_memory: [95293231100000], accelerator_type:L4: [10000]}}, "labels":{"ray.io/accelerator-type":"L4","ray.io/node-id":"cf562760d44bbe7c695ad3e8c246c3a8d992e4ef7594a5654c4c19c7",} is_draining: 0 is_idle: 0 Cluster resources (at most 20 nodes are shown): node id: -7930779791598977017{"total":{accelerator_type:L4: 10000, node:__internal_head__: 10000, node:10.128.0.163: 10000, object_store_memory: 95293231100000, CPU: 80000, GPU: 10000, memory: 222350872580000}}, "available": {CPU: 70000, node:10.128.0.163: 10000, node:__internal_head__: 10000, object_store_memory: 95293231100000, accelerator_type:L4: 10000, GPU: 10000, memory: 222350872580000}}, "labels":{"ray.io/node-id":"cf562760d44bbe7c695ad3e8c246c3a8d992e4ef7594a5654c4c19c7","ray.io/accelerator-type":"L4",}, "is_draining": 0, "draining_deadline_timestamp_ms": -1} { "placement group locations": [], "node to bundles": []}
|
| 19 |
+
Waiting leases size: 0
|
| 20 |
+
Number of granted lease arguments: 1
|
| 21 |
+
Number of pinned lease arguments: 0
|
| 22 |
+
Number of total spilled leases: 0
|
| 23 |
+
Number of spilled waiting leases: 0
|
| 24 |
+
Number of spilled unschedulable leases: 0
|
| 25 |
+
Resource usage {
|
| 26 |
+
- (language=PYTHON actor_or_taskTaskRunner.__init__ pid=2410 worker_id=a90c7d025be10c8b52ddb0c367136d57136cfc58ba507147879e9508): {CPU: 1}
|
| 27 |
+
}
|
| 28 |
+
Backlog Size per scheduling descriptor :{workerId: num backlogs}:
|
| 29 |
+
|
| 30 |
+
Granted leases by scheduling class:
|
| 31 |
+
- {depth=1 function_descriptor={type=PythonFunctionDescriptor, module_name=main_ppo, class_name=TaskRunner, function_name=__init__, function_hash=bd0c197dfe784848a2857f58f0c85f47} scheduling_strategy=default_scheduling_strategy {
|
| 32 |
+
}
|
| 33 |
+
resource_set={CPU : 1, }label_selector={}}fallback_strategy=[]: 1/8
|
| 34 |
+
==================================================
|
| 35 |
+
|
| 36 |
+
ClusterResources:
|
| 37 |
+
LocalObjectManager:
|
| 38 |
+
- num pinned objects: 0
|
| 39 |
+
- pinned objects size: 0
|
| 40 |
+
- num objects pending restore: 0
|
| 41 |
+
- num objects pending spill: 0
|
| 42 |
+
- num bytes pending spill: 0
|
| 43 |
+
- num bytes currently spilled: 0
|
| 44 |
+
- cumulative spill requests: 0
|
| 45 |
+
- cumulative restore requests: 0
|
| 46 |
+
- spilled objects pending delete: 0
|
| 47 |
+
|
| 48 |
+
ObjectManager:
|
| 49 |
+
- num local objects: 0
|
| 50 |
+
- num unfulfilled push requests: 0
|
| 51 |
+
- num object pull requests: 0
|
| 52 |
+
- num chunks received total: 0
|
| 53 |
+
- num chunks received failed (all): 0
|
| 54 |
+
- num chunks received failed / cancelled: 0
|
| 55 |
+
- num chunks received failed / plasma error: 0
|
| 56 |
+
Event stats:
|
| 57 |
+
Global stats: 0 total (0 active)
|
| 58 |
+
Queueing time: mean = -nanms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
|
| 59 |
+
Execution time: mean = -nanms, total = 0.00ms
|
| 60 |
+
Event stats:
|
| 61 |
+
PushManager:
|
| 62 |
+
- num pushes remaining: 0
|
| 63 |
+
- num chunks in flight: 0
|
| 64 |
+
- num chunks remaining: 0
|
| 65 |
+
- max chunks allowed: 409
|
| 66 |
+
OwnershipBasedObjectDirectory:
|
| 67 |
+
- num listeners: 0
|
| 68 |
+
- cumulative location updates: 0
|
| 69 |
+
- num location updates per second: 0.000
|
| 70 |
+
- num location lookups per second: 0.000
|
| 71 |
+
- num locations added per second: 0.000
|
| 72 |
+
- num locations removed per second: 0.000
|
| 73 |
+
BufferPool:
|
| 74 |
+
- create buffer state map size: 0
|
| 75 |
+
PullManager:
|
| 76 |
+
- num bytes available for pulled objects: 9529323110
|
| 77 |
+
- num bytes being pulled (all): 0
|
| 78 |
+
- num bytes being pulled / pinned: 0
|
| 79 |
+
- get request bundles: BundlePullRequestQueue{0 total, 0 active, 0 inactive, 0 unpullable}
|
| 80 |
+
- wait request bundles: BundlePullRequestQueue{0 total, 0 active, 0 inactive, 0 unpullable}
|
| 81 |
+
- task request bundles: BundlePullRequestQueue{0 total, 0 active, 0 inactive, 0 unpullable}
|
| 82 |
+
- first get request bundle: N/A
|
| 83 |
+
- first wait request bundle: N/A
|
| 84 |
+
- first task request bundle: N/A
|
| 85 |
+
- num objects queued: 0
|
| 86 |
+
- num objects actively pulled (all): 0
|
| 87 |
+
- num objects actively pulled / pinned: 0
|
| 88 |
+
- num bundles being pulled: 0
|
| 89 |
+
- num pull retries: 0
|
| 90 |
+
- max timeout seconds: 0
|
| 91 |
+
- max timeout request is already processed. No entry.
|
| 92 |
+
|
| 93 |
+
WorkerPool:
|
| 94 |
+
- registered jobs: 1
|
| 95 |
+
- process_failed_job_config_missing: 0
|
| 96 |
+
- process_failed_rate_limited: 0
|
| 97 |
+
- process_failed_pending_registration: 0
|
| 98 |
+
- process_failed_runtime_env_setup_failed: 0
|
| 99 |
+
- num PYTHON workers: 8
|
| 100 |
+
- num PYTHON drivers: 1
|
| 101 |
+
- num PYTHON pending start requests: 0
|
| 102 |
+
- num PYTHON pending registration requests: 0
|
| 103 |
+
- num object spill callbacks queued: 0
|
| 104 |
+
- num object restore queued: 0
|
| 105 |
+
- num util functions queued: 0
|
| 106 |
+
- num idle workers: 7
|
| 107 |
+
LeaseDependencyManager:
|
| 108 |
+
- lease deps map size: 0
|
| 109 |
+
- get req map size: 0
|
| 110 |
+
- wait req map size: 0
|
| 111 |
+
- local objects map size: 0
|
| 112 |
+
WaitManager:
|
| 113 |
+
- num active wait requests: 0
|
| 114 |
+
Subscriber:
|
| 115 |
+
Channel WORKER_REF_REMOVED_CHANNEL
|
| 116 |
+
- cumulative subscribe requests: 0
|
| 117 |
+
- cumulative unsubscribe requests: 0
|
| 118 |
+
- active subscribed publishers: 0
|
| 119 |
+
- cumulative published messages: 0
|
| 120 |
+
- cumulative processed messages: 0
|
| 121 |
+
Channel WORKER_OBJECT_LOCATIONS_CHANNEL
|
| 122 |
+
- cumulative subscribe requests: 0
|
| 123 |
+
- cumulative unsubscribe requests: 0
|
| 124 |
+
- active subscribed publishers: 0
|
| 125 |
+
- cumulative published messages: 0
|
| 126 |
+
- cumulative processed messages: 0
|
| 127 |
+
Channel WORKER_OBJECT_EVICTION
|
| 128 |
+
- cumulative subscribe requests: 0
|
| 129 |
+
- cumulative unsubscribe requests: 0
|
| 130 |
+
- active subscribed publishers: 0
|
| 131 |
+
- cumulative published messages: 0
|
| 132 |
+
- cumulative processed messages: 0
|
| 133 |
+
num async plasma notifications: 0
|
| 134 |
+
Event stats:
|
| 135 |
+
Global stats: 826 total (24 active)
|
| 136 |
+
Queueing time: mean = 10.27ms, max = 1099.92ms, min = 0.00ms, total = 8481.77ms
|
| 137 |
+
Execution time: mean = 8.39ms, total = 6929.63ms
|
| 138 |
+
Event stats:
|
| 139 |
+
RaySyncer.OnDemandBroadcasting - 100 total (1 active), Execution time: mean = 0.02ms, total = 1.77ms, Queueing time: mean = 0.14ms, max = 6.15ms, min = 0.02ms, total = 14.11ms
|
| 140 |
+
ObjectManager.UpdateAvailableMemory - 100 total (0 active), Execution time: mean = 0.00ms, total = 0.37ms, Queueing time: mean = 0.02ms, max = 0.21ms, min = 0.01ms, total = 2.41ms
|
| 141 |
+
NodeManager.CheckGC - 100 total (1 active), Execution time: mean = 0.00ms, total = 0.25ms, Queueing time: mean = 0.16ms, max = 6.17ms, min = 0.02ms, total = 15.50ms
|
| 142 |
+
NodeManagerService.grpc_server.ReportWorkerBacklog.HandleRequestImpl - 81 total (0 active), Execution time: mean = 0.07ms, total = 5.86ms, Queueing time: mean = 0.04ms, max = 0.47ms, min = 0.00ms, total = 2.88ms
|
| 143 |
+
NodeManagerService.grpc_server.ReportWorkerBacklog - 81 total (0 active), Execution time: mean = 0.31ms, total = 25.34ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
|
| 144 |
+
RayletWorkerPool.deadline_timer.kill_idle_workers - 50 total (1 active), Execution time: mean = 0.02ms, total = 0.95ms, Queueing time: mean = 0.12ms, max = 2.61ms, min = 0.02ms, total = 6.00ms
|
| 145 |
+
ClientConnection.async_read.ProcessMessageHeader - 41 total (9 active), Execution time: mean = 0.01ms, total = 0.21ms, Queueing time: mean = 56.87ms, max = 1099.92ms, min = 0.02ms, total = 2331.78ms
|
| 146 |
+
MemoryMonitor.CheckIsMemoryUsageAboveThreshold - 40 total (1 active), Execution time: mean = 0.23ms, total = 9.26ms, Queueing time: mean = 0.11ms, max = 3.20ms, min = 0.02ms, total = 4.44ms
|
| 147 |
+
ClientConnection.async_read.ProcessMessage - 32 total (0 active), Execution time: mean = 1.12ms, total = 35.92ms, Queueing time: mean = 0.01ms, max = 0.13ms, min = 0.00ms, total = 0.47ms
|
| 148 |
+
PeriodicalRunner.RunFnPeriodically - 14 total (0 active), Execution time: mean = 0.22ms, total = 3.15ms, Queueing time: mean = 6.03ms, max = 17.68ms, min = 0.05ms, total = 84.36ms
|
| 149 |
+
NodeManager.ScheduleAndGrantLeases - 11 total (1 active), Execution time: mean = 0.01ms, total = 0.16ms, Queueing time: mean = 0.06ms, max = 0.17ms, min = 0.02ms, total = 0.61ms
|
| 150 |
+
NodeManager.CheckForUnexpectedWorkerDisconnects - 11 total (1 active), Execution time: mean = 0.02ms, total = 0.18ms, Queueing time: mean = 0.05ms, max = 0.17ms, min = 0.01ms, total = 0.59ms
|
| 151 |
+
ClientConnection.async_write.DoAsyncWrites - 11 total (0 active), Execution time: mean = 0.00ms, total = 0.01ms, Queueing time: mean = 0.05ms, max = 0.31ms, min = 0.02ms, total = 0.53ms
|
| 152 |
+
NodeManagerService.grpc_server.GetSystemConfig.HandleRequestImpl - 10 total (0 active), Execution time: mean = 0.17ms, total = 1.74ms, Queueing time: mean = 0.08ms, max = 0.40ms, min = 0.01ms, total = 0.75ms
|
| 153 |
+
NodeManager.deadline_timer.flush_free_objects - 10 total (1 active), Execution time: mean = 0.00ms, total = 0.04ms, Queueing time: mean = 0.18ms, max = 1.58ms, min = 0.02ms, total = 1.81ms
|
| 154 |
+
ObjectManager.ObjectAdded - 10 total (0 active), Execution time: mean = 0.02ms, total = 0.19ms, Queueing time: mean = 0.13ms, max = 0.57ms, min = 0.02ms, total = 1.31ms
|
| 155 |
+
NodeManagerService.grpc_server.GetResourceLoad - 10 total (0 active), Execution time: mean = 0.44ms, total = 4.40ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
|
| 156 |
+
NodeManagerService.grpc_server.GetSystemConfig - 10 total (0 active), Execution time: mean = 0.65ms, total = 6.45ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
|
| 157 |
+
ObjectManager.ObjectDeleted - 10 total (0 active), Execution time: mean = 0.02ms, total = 0.16ms, Queueing time: mean = 0.08ms, max = 0.38ms, min = 0.02ms, total = 0.78ms
|
| 158 |
+
NodeManager.deadline_timer.spill_objects_when_over_threshold - 10 total (1 active), Execution time: mean = 0.00ms, total = 0.02ms, Queueing time: mean = 0.18ms, max = 1.58ms, min = 0.02ms, total = 1.82ms
|
| 159 |
+
NodeManagerService.grpc_server.GetResourceLoad.HandleRequestImpl - 10 total (0 active), Execution time: mean = 0.14ms, total = 1.44ms, Queueing time: mean = 0.03ms, max = 0.05ms, min = 0.02ms, total = 0.34ms
|
| 160 |
+
ReporterService.grpc_client.HealthCheck - 7 total (0 active), Execution time: mean = 0.79ms, total = 5.53ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
|
| 161 |
+
ReporterService.grpc_client.HealthCheck.OnReplyReceived - 7 total (0 active), Execution time: mean = 0.09ms, total = 0.65ms, Queueing time: mean = 0.04ms, max = 0.06ms, min = 0.02ms, total = 0.27ms
|
| 162 |
+
MetricsAgentClient.WaitForServerReadyWithRetry - 6 total (0 active), Execution time: mean = 0.20ms, total = 1.21ms, Queueing time: mean = 1000.03ms, max = 1000.07ms, min = 1000.01ms, total = 6000.21ms
|
| 163 |
+
- 4 total (0 active), Execution time: mean = 0.00ms, total = 0.00ms, Queueing time: mean = 0.05ms, max = 0.13ms, min = 0.02ms, total = 0.22ms
|
| 164 |
+
RaySyncer.BroadcastMessage - 4 total (0 active), Execution time: mean = 0.12ms, total = 0.49ms, Queueing time: mean = 0.00ms, max = 0.00ms, min = 0.00ms, total = 0.00ms
|
| 165 |
+
ClusterResourceManager.ResetRemoteNodeView - 4 total (1 active), Execution time: mean = 0.01ms, total = 0.03ms, Queueing time: mean = 0.03ms, max = 0.04ms, min = 0.03ms, total = 0.10ms
|
| 166 |
+
ray::rpc::InternalPubSubGcsService.grpc_client.GcsSubscriberPoll - 3 total (1 active), Execution time: mean = 737.29ms, total = 2211.87ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
|
| 167 |
+
event_loop_lag_probe - 3 total (0 active), Execution time: mean = 0.01ms, total = 0.03ms, Queueing time: mean = 1.95ms, max = 5.83ms, min = 0.00ms, total = 5.84ms
|
| 168 |
+
ray::rpc::InternalPubSubGcsService.grpc_client.GcsSubscriberCommandBatch.OnReplyReceived - 2 total (0 active), Execution time: mean = 0.20ms, total = 0.41ms, Queueing time: mean = 1.19ms, max = 2.36ms, min = 0.01ms, total = 2.37ms
|
| 169 |
+
RaySyncerRegister - 2 total (0 active), Execution time: mean = 0.02ms, total = 0.04ms, Queueing time: mean = 0.00ms, max = 0.00ms, min = 0.00ms, total = 0.00ms
|
| 170 |
+
ray::rpc::InternalPubSubGcsService.grpc_client.GcsSubscriberPoll.OnReplyReceived - 2 total (0 active), Execution time: mean = 0.33ms, total = 0.67ms, Queueing time: mean = 0.28ms, max = 0.42ms, min = 0.14ms, total = 0.56ms
|
| 171 |
+
NodeManager.deadline_timer.record_metrics - 2 total (1 active), Execution time: mean = 0.15ms, total = 0.29ms, Queueing time: mean = 0.10ms, max = 0.21ms, min = 0.21ms, total = 0.21ms
|
| 172 |
+
ray::rpc::InternalPubSubGcsService.grpc_client.GcsSubscriberCommandBatch - 2 total (0 active), Execution time: mean = 0.98ms, total = 1.96ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
|
| 173 |
+
ray::rpc::WorkerInfoGcsService.grpc_client.ReportWorkerFailure.OnReplyReceived - 1 total (0 active), Execution time: mean = 0.02ms, total = 0.02ms, Queueing time: mean = 0.57ms, max = 0.57ms, min = 0.57ms, total = 0.57ms
|
| 174 |
+
CoreWorkerService.grpc_client.Exit - 1 total (0 active), Execution time: mean = 1.28ms, total = 1.28ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
|
| 175 |
+
ray::rpc::NodeInfoGcsService.grpc_client.CheckAlive.OnReplyReceived - 1 total (0 active), Execution time: mean = 0.02ms, total = 0.02ms, Queueing time: mean = 0.02ms, max = 0.02ms, min = 0.02ms, total = 0.02ms
|
| 176 |
+
Subscriber.HandlePublishedMessage_GCS_WORKER_DELTA_CHANNEL - 1 total (0 active), Execution time: mean = 0.01ms, total = 0.01ms, Queueing time: mean = 0.21ms, max = 0.21ms, min = 0.21ms, total = 0.21ms
|
| 177 |
+
Subscriber.HandlePublishedMessage_GCS_JOB_CHANNEL - 1 total (0 active), Execution time: mean = 0.09ms, total = 0.09ms, Queueing time: mean = 0.47ms, max = 0.47ms, min = 0.47ms, total = 0.47ms
|
| 178 |
+
NodeManager.GCTaskFailureReason - 1 total (1 active), Execution time: mean = 0.00ms, total = 0.00ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
|
| 179 |
+
ray::rpc::NodeInfoGcsService.grpc_client.CheckAlive - 1 total (0 active), Execution time: mean = 0.73ms, total = 0.73ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
|
| 180 |
+
ray::rpc::JobInfoGcsService.grpc_client.GetAllJobInfo - 1 total (0 active), Execution time: mean = 0.77ms, total = 0.77ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
ray::rpc::JobInfoGcsService.grpc_client.AddJob - 1 total (0 active), Execution time: mean = 0.92ms, total = 0.92ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
ray::rpc::NodeInfoGcsService.grpc_client.RegisterNode - 1 total (0 active), Execution time: mean = 1.14ms, total = 1.14ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
NodeManagerService.grpc_server.RequestWorkerLease.HandleRequestImpl - 1 total (0 active), Execution time: mean = 0.25ms, total = 0.25ms, Queueing time: mean = 0.04ms, max = 0.04ms, min = 0.04ms, total = 0.04ms
ray::rpc::NodeInfoGcsService.grpc_client.GetAllNodeAddressAndLiveness.OnReplyReceived - 1 total (0 active), Execution time: mean = 0.23ms, total = 0.23ms, Queueing time: mean = 0.03ms, max = 0.03ms, min = 0.03ms, total = 0.03ms
ray::rpc::JobInfoGcsService.grpc_client.GetAllJobInfo.OnReplyReceived - 1 total (0 active), Execution time: mean = 0.02ms, total = 0.02ms, Queueing time: mean = 0.03ms, max = 0.03ms, min = 0.03ms, total = 0.03ms
NodeManagerService.grpc_server.GetWorkerPIDs.HandleRequestImpl - 1 total (0 active), Execution time: mean = 0.12ms, total = 0.12ms, Queueing time: mean = 0.02ms, max = 0.02ms, min = 0.02ms, total = 0.02ms
ray::rpc::InternalKVGcsService.grpc_client.GetInternalConfig - 1 total (0 active), Execution time: mean = 0.79ms, total = 0.79ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
NodeManagerService.grpc_server.GetWorkerPIDs - 1 total (0 active), Execution time: mean = 0.40ms, total = 0.40ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
ray::rpc::WorkerInfoGcsService.grpc_client.ReportWorkerFailure - 1 total (0 active), Execution time: mean = 2.88ms, total = 2.88ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
ray::rpc::NodeInfoGcsService.grpc_client.RegisterNode.OnReplyReceived - 1 total (0 active), Execution time: mean = 0.40ms, total = 0.40ms, Queueing time: mean = 0.02ms, max = 0.02ms, min = 0.02ms, total = 0.02ms
NodeManager.GcsCheckAlive - 1 total (1 active), Execution time: mean = 0.00ms, total = 0.00ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
ray::rpc::InternalKVGcsService.grpc_client.GetInternalConfig.OnReplyReceived - 1 total (0 active), Execution time: mean = 22.86ms, total = 22.86ms, Queueing time: mean = 0.02ms, max = 0.02ms, min = 0.02ms, total = 0.02ms
ray::rpc::NodeInfoGcsService.grpc_client.GetAllNodeAddressAndLiveness - 1 total (0 active), Execution time: mean = 0.82ms, total = 0.82ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
NodeManager.deadline_timer.print_event_loop_stats - 1 total (1 active), Execution time: mean = 0.00ms, total = 0.00ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
NodeManager.deadline_timer.debug_state_dump - 1 total (1 active, 1 running), Execution time: mean = 0.00ms, total = 0.00ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
CoreWorkerService.grpc_client.Exit.OnReplyReceived - 1 total (0 active), Execution time: mean = 0.03ms, total = 0.03ms, Queueing time: mean = 0.03ms, max = 0.03ms, min = 0.03ms, total = 0.03ms
NodeManagerService.grpc_server.RequestWorkerLease - 1 total (0 active), Execution time: mean = 4574.75ms, total = 4574.75ms, Queueing time: mean = 0.00ms, max = -0.00ms, min = 9223372036854.78ms, total = 0.00ms
ray::rpc::JobInfoGcsService.grpc_client.AddJob.OnReplyReceived - 1 total (0 active), Execution time: mean = 0.05ms, total = 0.05ms, Queueing time: mean = 0.04ms, max = 0.04ms, min = 0.04ms, total = 0.04ms
DebugString() time ms: 1

test/events/event_AUTOSCALER.log
ADDED
File without changes

test/events/event_CORE_WORKER_1384.log
ADDED
File without changes

test/events/event_CORE_WORKER_1922.log
ADDED
File without changes

test/events/event_CORE_WORKER_1923.log
ADDED
File without changes

test/events/event_CORE_WORKER_1924.log
ADDED
File without changes

test/events/event_CORE_WORKER_1925.log
ADDED
File without changes

test/events/event_CORE_WORKER_1926.log
ADDED
File without changes

test/events/event_CORE_WORKER_1927.log
ADDED
File without changes

test/events/event_CORE_WORKER_1928.log
ADDED
File without changes

test/events/event_CORE_WORKER_1929.log
ADDED
File without changes

test/events/event_CORE_WORKER_2410.log
ADDED
@@ -0,0 +1 @@
+{"custom_fields":{"worker_id":"a90c7d025be10c8b52ddb0c367136d57136cfc58ba507147879e9508"},"event_id":"7927057f1a3508400200893de0c9b7ed1726","host_name":"cs-01kje4289qf3k6pv20jzcef9t8","label":"RAY_FATAL_CHECK_FAILED","message":"src/ray/core_worker/task_execution/task_receiver.cc:132 (PID: 2410, TID: 2410, errno: 32 (Broken pipe)): An unexpected system state has occurred. You have likely discovered a bug in Ray. Please report this issue at https://github.com/ray-project/ray/issues and we'll work with you to fix it. Check failed: actor_creation_task_done_() Status not OK: IOError: Broken pipe \\n*** StackTrace Information ***\\n/usr/local/lib/python3.12/dist-packages/ray/_raylet.so(+0x16bc06a) [0x7cd486fab06a] ray::operator<<()\\n/usr/local/lib/python3.12/dist-packages/ray/_raylet.so(_ZN3ray6RayLogD1Ev+0x488) [0x7cd486fad6b8] ray::RayLog::~RayLog()\\n/usr/local/lib/python3.12/dist-packages/ray/_raylet.so(+0xb5fc1a) [0x7cd48644ec1a] ray::core::TaskReceiver::HandleTask()::{lambda()#1}::operator()()::{lambda()#1}::operator()()\\n/usr/local/lib/python3.12/dist-packages/ray/_raylet.so(+0xb7b5b2) [0x7cd48646a5b2] ray::core::InboundRequest::Accept()\\n/usr/local/lib/python3.12/dist-packages/ray/_raylet.so(+0xb6bc8b) [0x7cd48645ac8b] ray::core::NormalSchedulingQueue::ScheduleRequests()\\n/usr/local/lib/python3.12/dist-packages/ray/_raylet.so(+0x1012fa8) [0x7cd486901fa8] EventTracker::RecordExecution()\\n/usr/local/lib/python3.12/dist-packages/ray/_raylet.so(+0x1009f57) [0x7cd4868f8f57] std::_Function_handler<>::_M_invoke()\\n/usr/local/lib/python3.12/dist-packages/ray/_raylet.so(+0xb8f31b) [0x7cd48647e31b] boost::asio::detail::executor_op<>::do_complete()\\n/usr/local/lib/python3.12/dist-packages/ray/_raylet.so(+0x16873db) [0x7cd486f763db] boost::asio::detail::scheduler::do_run_one()\\n/usr/local/lib/python3.12/dist-packages/ray/_raylet.so(+0x1688d79) [0x7cd486f77d79] boost::asio::detail::scheduler::run()\\n/usr/local/lib/python3.12/dist-packages/ray/_raylet.so(+0x1689482) [0x7cd486f78482] boost::asio::io_context::run()\\n/usr/local/lib/python3.12/dist-packages/ray/_raylet.so(_ZN3ray4core10CoreWorker20RunTaskExecutionLoopEv+0x127) [0x7cd4862fbaf7] ray::core::CoreWorker::RunTaskExecutionLoop()\\n/usr/local/lib/python3.12/dist-packages/ray/_raylet.so(_ZN3ray4core21CoreWorkerProcessImpl26RunWorkerTaskExecutionLoopEv+0x41) [0x7cd486351461] ray::core::CoreWorkerProcessImpl::RunWorkerTaskExecutionLoop()\\n/usr/local/lib/python3.12/dist-packages/ray/_raylet.so(_ZN3ray4core17CoreWorkerProcess20RunTaskExecutionLoopEv+0x1d) [0x7cd48635167d] ray::core::CoreWorkerProcess::RunTaskExecutionLoop()\\n/usr/local/lib/python3.12/dist-packages/ray/_raylet.so(+0x881e81) [0x7cd486170e81] __pyx_pw_3ray_7_raylet_10CoreWorker_5run_task_loop()\\nray::TaskRunner(PyObject_Vectorcall+0x36) [0x5627f6] PyObject_Vectorcall\\nray::TaskRunner(_PyEval_EvalFrameDefault+0x701) [0x54a2e1] _PyEval_EvalFrameDefault\\nray::TaskRunner(PyEval_EvalCode+0x99) [0x620799] PyEval_EvalCode\\nray::TaskRunner() [0x65c44b]\\nray::TaskRunner() [0x6574d6]\\nray::TaskRunner() [0x654145]\\nray::TaskRunner(_PyRun_SimpleFileObject+0x1a5) [0x653e15] _PyRun_SimpleFileObject\\nray::TaskRunner(_PyRun_AnyFileObject+0x47) [0x653927] _PyRun_AnyFileObject\\nray::TaskRunner(Py_RunMain+0x375) [0x650605] Py_RunMain\\nray::TaskRunner(Py_BytesMain+0x2d) [0x60962d] Py_BytesMain\\n/lib/x86_64-linux-gnu/libc.so.6(+0x29d90) [0x7cd48eca3d90]\\n/lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0x80) [0x7cd48eca3e40] __libc_start_main\\nray::TaskRunner(_start+0x25) [0x6094a5] _start\\n","pid":"2410","severity":"FATAL","source_type":"CORE_WORKER","timestamp":1772150674}

test/events/event_GCS.log
ADDED
File without changes

test/events/event_RAYLET.log
ADDED
File without changes

test/export_events/event_EXPORT_ACTOR.log
ADDED
File without changes

test/export_events/event_EXPORT_DRIVER_JOB.log
ADDED
File without changes

test/export_events/event_EXPORT_NODE.log
ADDED
File without changes