Lightcap committed
Commit a0bd4b2 · verified · 1 Parent(s): 4c173a3

Add small agent runtime telemetry dataset

README.md ADDED
@@ -0,0 +1,104 @@
---
pretty_name: Agent Runtime Telemetry Small
license: other
language:
- en
tags:
- agent-runtime
- agent-observability
- llm-observability
- mcp
- tool-calling
- runtime-telemetry
- audit-trail
- workflow-traces
- parquet
size_categories:
- 10K<n<100K
configs:
- config_name: dataset_overview
  data_files:
  - split: train
    path: data/dataset_overview.parquet
- config_name: operations
  data_files:
  - split: train
    path: data/operations.parquet
- config_name: operation_events
  data_files:
  - split: train
    path: data/operation_events.parquet
- config_name: artifact_records
  data_files:
  - split: train
    path: data/artifact_records.parquet
- config_name: audit_records
  data_files:
  - split: train
    path: data/audit_records.parquet
- config_name: tool_summary
  data_files:
  - split: train
    path: data/tool_summary.parquet
- config_name: artifact_summary
  data_files:
  - split: train
    path: data/artifact_summary.parquet
- config_name: daily_activity
  data_files:
  - split: train
    path: data/daily_activity.parquet
---

# Agent Runtime Telemetry Small

Agent Runtime Telemetry Small is a compact tabular export of MCP-style agent execution telemetry. It is designed for Dataset Viewer inspection, lightweight agent-observability experiments, tool-call reliability analysis, workflow trace summaries, and audit-trail research.

The dataset is intentionally small and row-oriented. Each table is stored as Parquet so the Hugging Face Dataset Viewer can display clean columns without requiring a SQLite client.

## What It Contains

| Config | Rows | Columns | Purpose |
|---|---:|---:|---|
| `dataset_overview` | 7 | 6 | Table inventory and export policy |
| `operations` | 2,262 | 33 | Tool execution records: status, stages, durations, and summarized result metadata |
| `operation_events` | 9,903 | 13 | Lifecycle events for operations |
| `artifact_records` | 1,269 | 19 | Index records for forecast, state-decode, and training artifacts |
| `audit_records` | 14,053 | 17 | Tool request/result audit rows with compact metadata |
| `tool_summary` | 32 | 8 | Aggregated tool reliability and latency statistics |
| `artifact_summary` | 9 | 7 | Aggregated artifact status and payload-size statistics |
| `daily_activity` | 8 | 5 | UTC daily activity counts across runtime tables |

## Privacy Boundary

This export does not include the original SQLite databases or raw nested `payload_json` bodies. Large JSON fields are represented by inspectable columns such as key lists, byte lengths, selected scalar status fields, and SHA-256 digests. Absolute local paths are reduced to path-scope and file-name columns.

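The reduction described above can be sketched in a few lines. This is an illustration, not the exporter's actual code; the function name is made up, and only the output column names mirror the real schema:

```python
import hashlib
import json


def summarize_payload(payload: dict) -> dict:
    """Reduce a nested JSON payload to flat, inspectable columns:
    key list, byte length, a selected scalar status, and a SHA-256 digest."""
    body = json.dumps(payload, sort_keys=True).encode("utf-8")
    return {
        "payload_key_count": len(payload),
        "payload_keys": ",".join(sorted(payload)),
        "payload_bytes": len(body),
        "payload_sha256": hashlib.sha256(body).hexdigest(),
        "payload_status": payload.get("status"),
    }


row = summarize_payload({"status": "ok", "forecast": [1.2, 3.4]})
print(row["payload_keys"])  # forecast,status
```

The digest lets rows be deduplicated and cross-referenced without the payload body ever leaving the local machine.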
## Suggested Uses

- compare agent tool success/error rates across runtime traces
- inspect workflow latency and stage transitions
- prototype LLM agent observability dashboards
- analyze audit request/result volume without parsing full JSON logs
- benchmark small-data telemetry pipelines that expect clean tabular inputs

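For the first use case, a per-tool status share falls out of the `tool_summary` columns (`tool_name`, `status`, `operation_count`). The tool names and status values below are invented for illustration; only the column names match the dataset:

```python
import pandas as pd

# Rows shaped like the tool_summary config; values are illustrative.
summary = pd.DataFrame(
    {
        "tool_name": ["forecast", "forecast", "decode_state"],
        "status": ["succeeded", "failed", "succeeded"],
        "operation_count": [90, 10, 40],
    }
)

# Share of each tool's operations that ended in each status.
totals = summary.groupby("tool_name")["operation_count"].transform("sum")
summary["status_share"] = summary["operation_count"] / totals
print(summary.sort_values("status_share").to_string(index=False))
```

The same groupby works unchanged on the real config after `to_pandas()`.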
## Loading Example

```python
from datasets import load_dataset

ops = load_dataset("YOUR_USERNAME/agent-runtime-telemetry-small", "operations")
print(ops["train"][0])

summary = load_dataset("YOUR_USERNAME/agent-runtime-telemetry-small", "tool_summary")
print(summary["train"].to_pandas().sort_values("operation_count", ascending=False).head())
```

## Source

The rows were exported from local runtime SQLite stores into sanitized Parquet tables:

- `operation_state.sqlite3`
- `artifact_store.sqlite3`
- `audit_store.sqlite3`

The export focuses on the operational shape of agent runtimes rather than application-specific content.
UPLOAD_INSTRUCTIONS.md ADDED
@@ -0,0 +1,93 @@
# Upload Instructions

Target dataset name:

```text
agent-runtime-telemetry-small
```

Recommended repo id:

```text
YOUR_HF_USERNAME/agent-runtime-telemetry-small
```

## Files To Upload

Upload the full contents of this folder:

```text
data/huggingface_exports/agent-runtime-telemetry-small/
```

Do not upload the original runtime SQLite files. This folder already contains the viewer-friendly Parquet export and the Hugging Face dataset card.

## CLI Upload

From the repository root:

```bash
python3 - <<'PY'
from pathlib import Path
import os

from dotenv import load_dotenv
from huggingface_hub import HfApi, create_repo, upload_folder

root = Path("data/huggingface_exports/agent-runtime-telemetry-small").resolve()
load_dotenv(".env")

token = (
    os.getenv("HF_TOKEN")
    or os.getenv("HUGGINGFACE_HUB_TOKEN")
    or os.getenv("HUGGING_FACE_HUB_TOKEN")
)
if not token:
    raise SystemExit("Missing HF_TOKEN or HUGGINGFACE_HUB_TOKEN.")

api = HfApi(token=token)
username = api.whoami(token=token)["name"]
repo_id = f"{username}/agent-runtime-telemetry-small"

create_repo(repo_id=repo_id, repo_type="dataset", token=token, exist_ok=True, private=False)
upload_folder(
    repo_id=repo_id,
    repo_type="dataset",
    folder_path=str(root),
    token=token,
    commit_message="Add small agent runtime telemetry dataset",
)

print(f"https://huggingface.co/datasets/{repo_id}")
PY
```

## Manual Web Upload

1. Create a new Hugging Face Dataset named `agent-runtime-telemetry-small`.
2. Upload `README.md`, `export_manifest.json`, and the full `data/` directory from this folder.
3. Wait for the Dataset Viewer to process the Parquet files.
4. Confirm the configs appear as `operations`, `operation_events`, `artifact_records`, `audit_records`, `tool_summary`, `artifact_summary`, `daily_activity`, and `dataset_overview`.

## Validation After Upload

```python
from datasets import load_dataset

repo_id = "YOUR_HF_USERNAME/agent-runtime-telemetry-small"
for config in ["operations", "operation_events", "artifact_records", "audit_records", "tool_summary"]:
    ds = load_dataset(repo_id, config)
    print(config, ds["train"].num_rows, ds["train"].column_names[:8])
```

## Export Policy

This folder is a sanitized export:

- no source SQLite database files
- no raw nested `payload_json` bodies
- no absolute local paths
- no secret-like token strings
- Parquet tables optimized for the Hugging Face Dataset Viewer

If you regenerate from newer runtime state, keep the same policy so the dataset remains useful and safe to browse.
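The "no secret-like token strings" item can be spot-checked before upload. The patterns below are rough illustrations of common credential shapes, not the exporter's actual redaction rules:

```python
import re

# Rough shapes of common credentials: Hugging Face tokens (hf_...),
# AWS access key ids (AKIA...), and long base64-like blobs. Illustrative only.
SECRET_PATTERNS = [
    re.compile(r"\bhf_[A-Za-z0-9]{20,}\b"),
    re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    re.compile(r"[A-Za-z0-9+/]{40,}={0,2}"),
]


def looks_secret_like(text: str) -> bool:
    """Return True if any rough credential pattern matches the text."""
    return any(p.search(text) for p in SECRET_PATTERNS)


print(looks_secret_like("hf_" + "x" * 30))    # True
print(looks_secret_like("42 rows exported"))  # False
```

Running such a scan over string columns before `upload_folder` gives a cheap last line of defense.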
data/artifact_records.parquet ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a1e02f24eb11596dd12ca2e59f4be88b24872b1727c1b96bdc8463b268c1dd92
size 103040
data/artifact_summary.parquet ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a7e98cf8a1de3ad840e8e04a275ac1cf9454e29278e1e08d9e72465e812d1017
size 5806
data/audit_records.parquet ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:44c223bf1fdb2817ce461c2cbf650f31723fe1a7008d41d7686535eff87f5f9d
size 2247227
data/daily_activity.parquet ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:616e42658d086cf811ed2b72673154748842d33f4bd78cbe30cc61bc4195877e
size 3827
data/dataset_overview.parquet ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3e8cb25a66e296926875fc6d16bd9fd9ff3659ddf3566e3036e63495be2529c5
size 4796
data/operation_events.parquet ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4db53085f861baff539e12c1a238dc82ec840d881f7b6ca02c488f3d70e96584
size 610446
data/operations.parquet ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9fb8a1b19d5ae1f439fa8c3a09a45e8965801fd85937ef424788467dc7700655
size 276568
data/tool_summary.parquet ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:eb012f2611ac6d4d45bcffdac21c13d785f7298129ea9bd288b65d7603529f58
size 8030
export_manifest.json ADDED
@@ -0,0 +1,200 @@
{
  "name": "agent-runtime-telemetry-small",
  "created_at_utc": "2026-04-21T15:04:52.808915+00:00",
  "source_runtime_databases": {
    "operation_state": {
      "path": "data/runtime_state/operation_state.sqlite3",
      "size_bytes": 201281536
    },
    "artifact_store": {
      "path": "data/runtime_state/artifact_store.sqlite3",
      "size_bytes": 43900928
    },
    "audit_store": {
      "path": "data/runtime_state/audit_store.sqlite3",
      "size_bytes": 191569920
    }
  },
  "export_root": "data/huggingface_exports/agent-runtime-telemetry-small",
  "tables": {
    "operations": {
      "rows": 2262,
      "columns": [
        "operation_id",
        "request_id",
        "tool_name",
        "status",
        "stage",
        "duration_ms",
        "created_at_utc",
        "updated_at_utc",
        "args_fingerprint",
        "args_count",
        "args_keys",
        "kwargs_key_count",
        "kwargs_keys",
        "operation_mode",
        "backend_preference",
        "force_retrain",
        "include_control_sensitivities",
        "include_validation_protocols",
        "has_input_provenance",
        "has_source_binding",
        "series_rows_count",
        "scenario_rows_count",
        "result_summary_key_count",
        "result_summary_keys",
        "result_type",
        "result_operation",
        "result_payload_key_count",
        "result_payload_keys",
        "result_payload_bytes",
        "artifacts_bytes",
        "error_type",
        "error_message_preview",
        "error_message_sha256"
      ],
      "column_count": 33,
      "file": "data/operations.parquet",
      "size_bytes": 276568
    },
    "operation_events": {
      "rows": 9903,
      "columns": [
        "event_id",
        "operation_id",
        "event_type",
        "status",
        "stage",
        "event_time_utc",
        "payload_bytes",
        "payload_sha256",
        "payload_key_count",
        "payload_keys",
        "payload_status",
        "payload_stage",
        "payload_tool"
      ],
      "column_count": 13,
      "file": "data/operation_events.parquet",
      "size_bytes": 610446
    },
    "artifact_records": {
      "rows": 1269,
      "columns": [
        "record_id",
        "artifact_kind",
        "artifact_file",
        "artifact_path_scope",
        "config_tag",
        "schema_name",
        "status",
        "payload_sha256",
        "metadata_key_count",
        "metadata_keys",
        "metadata_status",
        "metadata_source_mode",
        "metadata_state_count",
        "payload_key_count",
        "payload_keys",
        "payload_bytes",
        "has_forecast",
        "state_count",
        "recorded_at_utc"
      ],
      "column_count": 19,
      "file": "data/artifact_records.parquet",
      "size_bytes": 103040
    },
    "audit_records": {
      "rows": 14053,
      "columns": [
        "record_id",
        "category",
        "record_name",
        "record_file",
        "record_path_scope",
        "tool",
        "kind",
        "status",
        "duration_ms",
        "request_id",
        "payload_bytes",
        "payload_sha256",
        "payload_key_count",
        "payload_keys",
        "response_key_count",
        "response_keys",
        "created_at_utc"
      ],
      "column_count": 17,
      "file": "data/audit_records.parquet",
      "size_bytes": 2247227
    },
    "tool_summary": {
      "rows": 32,
      "columns": [
        "tool_name",
        "status",
        "operation_count",
        "avg_duration_ms",
        "median_duration_ms",
        "p95_duration_ms",
        "first_seen_utc",
        "last_seen_utc"
      ],
      "column_count": 8,
      "file": "data/tool_summary.parquet",
      "size_bytes": 8030
    },
    "artifact_summary": {
      "rows": 9,
      "columns": [
        "artifact_kind",
        "status",
        "artifact_count",
        "avg_payload_bytes",
        "median_payload_bytes",
        "first_recorded_utc",
        "last_recorded_utc"
      ],
      "column_count": 7,
      "file": "data/artifact_summary.parquet",
      "size_bytes": 5806
    },
    "daily_activity": {
      "rows": 8,
      "columns": [
        "date_utc",
        "operations",
        "operation_events",
        "artifact_records",
        "audit_records"
      ],
      "column_count": 5,
      "file": "data/daily_activity.parquet",
      "size_bytes": 3827
    },
    "dataset_overview": {
      "rows": 7,
      "columns": [
        "table_name",
        "row_count",
        "column_count",
        "file",
        "viewer_role",
        "payload_policy"
      ],
      "column_count": 6,
      "file": "data/dataset_overview.parquet",
      "size_bytes": 4796
    }
  },
  "privacy_boundary": {
    "raw_payload_json_uploaded": false,
    "absolute_paths_uploaded": false,
    "sqlite_databases_uploaded": false,
    "secret_like_text_redaction": true,
    "notes": "Export contains operational telemetry summaries derived from local runtime SQLite tables; large nested JSON payloads are represented by keys, byte lengths, selected scalar statuses, and SHA-256 digests."
  }
}