timlawrenz committed
Commit 714c354 · verified · 1 Parent(s): ef66e8f

Upload README.md with huggingface_hub

Files changed (1):
  1. README.md +111 -12
README.md CHANGED
@@ -1,23 +1,122 @@
 # GNN Ruby Code Study
 
 Systematic study of Graph Neural Network architectures for Ruby code complexity prediction and generation.
 
 ## Key Findings
 
- 1. **5-layer GraphSAGE** achieves MAE 4.018 (R^2 = 0.709) for complexity prediction - 16% better than 3-layer baseline
- 2. **GNN autoencoders produce 0% valid Ruby** across all tested architectures and loss functions
- 3. **Teacher-forced GIN decoder** achieves 7% validity - the only non-zero result in 35 experiments
- 4. **Total cost: ~$4.20** on Vast.ai RTX 4090 instances
 
- ## Contents
 
- - `paper.md` - Full research paper
- - `results/experiments.json` - All fleet experiment metrics
- - `results/autonomous_research.json` - 18 baseline variance replicates
- - `experiments/` - Fleet YAML specifications for all 3 tracks
- - `specs/` - Ratiocinator ResearchSpec YAMLs
 
 ## Source Code
 
- - Model code: [jubilant-palm-tree](https://github.com/timlawrenz/jubilant-palm-tree) (branch: `experiment/ratiocinator-gnn-study`)
- - Orchestrator: [ratiocinator](https://github.com/timlawrenz/ratiocinator)
+ ---
+ language:
+ - code
+ license: mit
+ task_categories:
+ - graph-ml
+ - text-classification
+ tags:
+ - code
+ - ast
+ - gnn
+ - graph-neural-network
+ - ruby
+ - complexity-prediction
+ - code-generation
+ - negative-results
+ size_categories:
+ - 10K<n<100K
+ ---
+
 # GNN Ruby Code Study
 
 Systematic study of Graph Neural Network architectures for Ruby code complexity prediction and generation.
 
+ **Paper:** [Graph Neural Networks for Ruby Code Complexity Prediction and Generation: A Systematic Architecture Study](paper.md)
+
+ ## Dataset
+
+ **22,452 Ruby methods** parsed into AST graphs with 74-dimensional node features.
+
+ | Split | Samples | File |
+ |-------|---------|------|
+ | Train | 19,084 | `dataset/train.jsonl` |
+ | Validation | 3,368 | `dataset/val.jsonl` |
+
+ Each JSONL record contains:
+ - `repo_name`: Source repository
+ - `file_path`: Original file path
+ - `raw_source`: Raw Ruby source code
+ - `complexity_score`: McCabe cyclomatic complexity
+ - `ast_json`: Full AST as nested JSON (node types + literal values)
+ - `id`: Unique identifier
+
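Records in this layout can be streamed with the standard library alone. A minimal sketch — the `iter_records` helper and the demo record are illustrative, not part of the repo:

```python
import json

def iter_records(lines):
    """Yield one dict per non-empty JSONL line."""
    for line in lines:
        if line.strip():
            yield json.loads(line)

# Demo with an invented record (real files: dataset/train.jsonl, dataset/val.jsonl):
demo = ['{"id": "x", "complexity_score": 3, "raw_source": "def m; end"}']
rec = next(iter_records(demo))
print(rec["complexity_score"])  # prints 3

# Usage against the actual training split, once downloaded:
#   with open("dataset/train.jsonl") as f:
#       scores = [r["complexity_score"] for r in iter_records(f)]
```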
+ ### Node Features (74D)
+ - One-hot encoding of 73 AST node types (def, send, args, lvar, str, ...) + 1 unknown
+ - Types cover Ruby AST nodes; literal values (identifiers, strings, numbers) map to unknown
+
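This feature scheme amounts to an index lookup into a fixed type vocabulary. A minimal sketch — the three-type vocabulary below is a stand-in for the repo's 73 Ruby AST node types:

```python
# 74-D one-hot node features: known AST node types + one "unknown" slot.
# Tiny illustrative vocabulary; the real list (def, send, args, lvar,
# str, ...) has 73 entries and lives in the repo's data pipeline.
NODE_TYPES = ["def", "send", "args"]   # stand-in for the 73 types
UNKNOWN = len(NODE_TYPES)              # index 3 here; index 73 in the study

def encode(node_type):
    """Return a one-hot vector; unrecognized types hit the unknown slot."""
    vec = [0.0] * (len(NODE_TYPES) + 1)
    idx = NODE_TYPES.index(node_type) if node_type in NODE_TYPES else UNKNOWN
    vec[idx] = 1.0
    return vec

print(encode("send"))     # known type: one-hot at its index
print(encode("my_var"))   # literal/identifier: unknown slot
```

Because every literal value collapses into the single unknown slot, the encoding is lossy — which is exactly the bottleneck the Key Findings below attribute the 0% generation validity to.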
 ## Key Findings
 
+ 1. **5-layer GraphSAGE** achieves MAE 4.018 (R² = 0.709) for complexity prediction, 16% better than the 3-layer baseline (9.9σ significant)
+ 2. **GNN autoencoders produce 0% valid Ruby** across all 15+ tested configurations
+ 3. **The literal value bottleneck**: Teacher-forced GIN achieves 81% node type accuracy and 99.5% type diversity, but 0% syntax validity, because 47% of AST elements are literals with no learnable representation
+ 4. **Chain decoders collapse**: 93% of predictions default to UNKNOWN without structural supervision
+ 5. **Total cost: ~$4.32** across 51 GPU experiments on Vast.ai RTX 4090 + a local RTX 2070 SUPER
+
+ ## Repository Structure
 
+ ```
+ ├── paper.md                      # Full research paper
+ ├── dataset/
+ │   ├── train.jsonl               # 19,084 Ruby methods (37 MB)
+ │   └── val.jsonl                 # 3,368 Ruby methods (6.5 MB)
+ ├── models/
+ │   ├── encoder_sage_5layer.pt    # Pre-trained SAGE encoder
+ │   └── decoders/                 # Trained decoder checkpoints
+ │       ├── tf-gin-256-deep.pt    # Best: teacher-forced GIN, 5 layers
+ │       ├── tf-gin-{128,256,512}.pt  # Dimension ablation
+ │       └── chain-gin-256.pt      # Control (no structural supervision)
+ ├── results/
+ │   ├── fleet_experiments.json    # All Vast.ai experiment metrics
+ │   ├── autonomous_research.json  # 18 baseline variance replicates
+ │   └── gin_deep_dive/            # Local deep-dive analysis
+ │       ├── summary.json          # Ablation summary table
+ │       └── *_results.json        # Per-config detailed results
+ ├── experiments/                  # Ratiocinator fleet YAML specs
+ ├── specs/                        # Ratiocinator research YAML specs
+ ├── src/                          # Model source code
+ │   ├── models.py                 # GNN architectures
+ │   ├── data_processing.py        # AST→graph pipeline
+ │   ├── loss.py                   # Loss functions
+ │   ├── train.py                  # Complexity prediction trainer
+ │   └── train_autoencoder.py      # Autoencoder trainer
+ └── scripts/                      # Runner and evaluation scripts
+ ```
 
+ ## Reproducing Results
+
+ ```bash
+ # Clone the experiment branch
+ git clone -b experiment/ratiocinator-gnn-study https://github.com/timlawrenz/jubilant-palm-tree
+ cd jubilant-palm-tree
+
+ # Install dependencies
+ python -m venv .venv && source .venv/bin/activate
+ pip install torch torchvision torch_geometric
+
+ # Train complexity prediction (Track 1)
+ python train.py --conv_type SAGE --num_layers 5 --epochs 50
+
+ # Train autoencoder with teacher-forced GIN decoder (Track 4)
+ python train_autoencoder.py --decoder_conv_type GIN --decoder_edge_mode teacher_forced --epochs 30
+
+ # Run the full deep-dive ablation
+ python scripts/gin_deep_dive.py
+ ```
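For reference, the MAE and R² figures reported in Key Findings follow the standard definitions; a generic sketch (not the repo's evaluation script, toy values invented):

```python
def mae(y_true, y_pred):
    """Mean absolute error: average |true - predicted|."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Toy complexity scores (invented), just to exercise the formulas:
y_true = [1, 4, 9, 2]
y_pred = [2, 4, 7, 2]
print(mae(y_true, y_pred))  # 0.75
print(r2(y_true, y_pred))
```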
 
 ## Source Code
 
+ - **Model code:** [jubilant-palm-tree](https://github.com/timlawrenz/jubilant-palm-tree) (branch: `experiment/ratiocinator-gnn-study`)
+ - **Orchestrator:** [ratiocinator](https://github.com/timlawrenz/ratiocinator)
+
+ ## Citation
+
+ If you use this dataset or findings, please cite:
+ ```
+ @misc{lawrenz2025gnnruby,
+   title={Graph Neural Networks for Ruby Code Complexity Prediction and Generation: A Systematic Architecture Study},
+   author={Tim Lawrenz},
+   year={2025},
+   howpublished={\url{https://huggingface.co/datasets/timlawrenz/gnn-ruby-code-study}}
+ }
+ ```