path: data/normalized/*.parquet
---

# 🏭 FactoryNet: A Unified Multi-Machine Industrial Dataset

## Overview

FactoryNet is a large-scale, machine-learning-ready foundation dataset for industrial robotics and manufacturing anomaly detection. Industrial datasets have historically been heavily siloed: every manufacturer and research team uses different column names, units, and structures.

FactoryNet addresses this by merging massive, high-frequency physical datasets from completely different machines into a **single, mathematically unified coordinate system**. By standardizing axes, effort signals, and kinematic feedback, the dataset lets neural networks learn universal physical relationships across hardware boundaries.

## 📦 Dataset Composition
This repository currently contains millions of rows of high-frequency sensor data merged from three distinct open-source industrial datasets:

### 1. UMich CNC Mill Tool Wear Dataset
* **Machine:** 3-Axis CNC Mill
* **Task:** Machining wax blocks under varying feedrates and clamp pressures.
* **Anomalies:** Tool wear (Unworn vs. Worn) and visual inspection failures.
* **Original Source:** University of Michigan (via Kaggle)

### 2. AURSAD (Automated UR3e Screwdriving Anomaly Dataset)
* **Machine:** UR3e 6-Axis Collaborative Robot
* **Task:** Automated screwdriving using an OnRobot Screwdriver.
* **Anomalies:** Normal operation, damaged screws, missing screws, extra parts, and damaged threads.
* **Original Source:** Zenodo (Record 4487073)

### 3. voraus-AD (Yu-Cobot Pick-and-Place)
* **Machine:** Yu-Cobot 6-Axis Collaborative Robot
* **Task:** Industrial pick-and-place task on a conveyor belt.
* **Anomalies:** 12 diverse physical anomalies, including axis wear (friction/miscommutation), gripping errors, collisions, and added axis weights.
* **Original Source:** voraus robotik (via Kaggle)

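Because each source names its anomaly classes differently, merging them requires folding every source's labels into one shared vocabulary. A minimal sketch of such a mapping follows; the source label strings and `machine_type` values below are illustrative placeholders, not the dataset's actual values:

```python
import pandas as pd

# Hypothetical per-source label vocabularies -> one shared ctx_anomaly_label.
# All label strings here are invented for this sketch.
LABEL_MAP = {
    "cnc_mill": {"unworn": "normal", "worn": "tool_wear"},
    "ur3e_screwdriving": {"ok": "normal", "damaged_screw": "damaged_screw"},
    "yu_cobot": {"normal": "normal", "collision": "collision"},
}

def unify_labels(df: pd.DataFrame) -> pd.DataFrame:
    """Map each row's source-specific label into the shared vocabulary."""
    df = df.copy()
    df["ctx_anomaly_label"] = [
        LABEL_MAP[machine][label]
        for machine, label in zip(df["machine_type"], df["source_label"])
    ]
    return df

# Tiny demo frame with one row per source
demo = pd.DataFrame({
    "machine_type": ["cnc_mill", "ur3e_screwdriving", "yu_cobot"],
    "source_label": ["worn", "ok", "collision"],
})
print(unify_labels(demo)["ctx_anomaly_label"].tolist())
# ['tool_wear', 'normal', 'collision']
```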
---

## 🏗️ The Unified Schema

To allow cross-machine learning, all raw variables (which previously had over 300 conflicting names) have been mapped to a standardized `FactoryNet` schema.

**Standardized Prefix Naming:**
* `setpoint_*`: The commanded target from the controller (e.g., `setpoint_pos_0`).
* `feedback_*`: The actual measured state from the sensors (e.g., `feedback_vel_1`).
* `effort_*`: The physical force/current applied (e.g., `effort_current_2`, `effort_torque_0`).
* `ctx_*`: Contextual metadata (e.g., `ctx_anomaly_label`, `ctx_busvoltage_0`).

**Standardized Axis Indexing:**
Regardless of how the original manufacturer numbered their joints (X/Y/Z or 1-6), all axes in this dataset are strictly zero-indexed (`0` through `5`).

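As an illustration of the convention, vendor-specific columns can be renamed into the `prefix_signal_axis` form with axes re-based to zero. The raw column names below are invented for this sketch and are not the actual names from any of the three sources:

```python
import pandas as pd

# Hypothetical raw vendor columns (invented for this sketch) -> FactoryNet schema.
# Joints numbered 1-6 by the vendor become zero-indexed axes 0-5.
RENAME = {
    "TargetPosJ1": "setpoint_pos_0",
    "ActualVelJ2": "feedback_vel_1",
    "MotorCurrentJ3": "effort_current_2",
    "BusVoltage": "ctx_busvoltage_0",
}

raw = pd.DataFrame({col: [0.0] for col in RENAME})
unified = raw.rename(columns=RENAME)

# A key payoff of prefix naming: whole signal groups select by prefix.
effort_cols = [c for c in unified.columns if c.startswith("effort_")]
print(effort_cols)  # ['effort_current_2']
```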
---
## ⚙️ Configurations
This dataset is partitioned into highly compressed Parquet files and is available in two configurations:

1. **`raw`**: The original physical values (Amps, Volts, Radians, etc.) mapped directly into the new schema. Best for physics-informed neural networks or domain-specific thresholding.
2. **`normalized`**: All continuous physical variables have been independently standardized with a z-score scaler (`StandardScaler`) fitted to that machine's domain. Best for immediate deep learning and foundation model training.
## 🚀 How to Use (Python)
Because this dataset is partitioned into Parquet files, you can load the multi-gigabyte repository without exhausting your local RAM.

```python
from datasets import load_dataset

# Load the mathematically normalized dataset for AI training
dataset = load_dataset("your_username/FactoryNet_Unified_Robot_Data", "normalized")

# Convert to a Pandas DataFrame
df = dataset['train'].to_pandas()
print(df['machine_type'].value_counts())
```
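Once loaded, a common first step for anomaly detection is splitting nominal cycles from anomalous ones via `ctx_anomaly_label`. The sketch below uses a synthetic frame mimicking the schema; the label string `"normal"` and the `machine_type` values are assumptions, so inspect `value_counts()` on the real data first:

```python
import pandas as pd

# Synthetic frame mimicking the FactoryNet schema (values invented for this sketch).
df = pd.DataFrame({
    "machine_type": ["cnc_mill", "cnc_mill", "yu_cobot", "ur3e"],
    "feedback_vel_1": [0.1, 0.4, 0.2, 0.3],
    "ctx_anomaly_label": ["normal", "tool_wear", "normal", "missing_screw"],
})

# Train detectors on nominal data only; hold anomalies out for evaluation.
# "normal" as the nominal label is an assumption -- check
# df["ctx_anomaly_label"].value_counts() on the real dataset first.
train_df = df[df["ctx_anomaly_label"] == "normal"]
eval_df = df[df["ctx_anomaly_label"] != "normal"]
print(len(train_df), len(eval_df))  # 2 2
```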