Upload folder using huggingface_hub

- README.md +253 -0
- geoforce_cnn_v1.1.pt +3 -0
- hyperparams.yaml +29 -0
- reservoir_cnn.py +302 -0
- technical-report.md +412 -0
- validation_metrics_v1.1.json +52 -0
README.md
ADDED
@@ -0,0 +1,253 @@

---
license: apache-2.0
language:
- en
tags:
- geothermal
- reservoir-simulation
- physics-informed
- surrogate-model
- energy
- cnn
- pytorch
- geoscience
- indonesia
library_name: pytorch
pipeline_tag: other
metrics:
- r_squared
- mse
datasets:
- ForceX-AI/geothermal-reservoir-dataset
model-index:
- name: GeoForce-CNN-v1.1
  results:
  - task:
      type: other
      name: Geothermal Reservoir Prediction
    metrics:
    - name: Temperature RMSE
      type: rmse
      value: 3.31
    - name: Temperature R²
      type: r_squared
      value: 0.994
    - name: Pressure RMSE (MPa)
      type: rmse
      value: 0.354
    - name: Pressure R²
      type: r_squared
      value: 0.997
    - name: Inference Time (ms)
      type: latency
      value: 3.19
    - name: Speedup vs Simulator
      type: speedup
      value: 282350
---

# GeoForce-CNN v1.1: Physics-Informed CNN Surrogate for Geothermal Reservoir Prediction

**GeoForce** is a physics-informed convolutional neural network (CNN) that predicts temperature and pressure field evolution in geothermal reservoirs. It replaces computationally expensive numerical simulators with a compact 57,802-parameter model that runs in **3.2 ms on CPU** — a **282,350x speedup** over finite-difference simulation.

<div align="center">

| Metric | Value |
|--------|-------|
| Temperature RMSE | **3.31°C** |
| Temperature R² | **0.994** |
| Pressure RMSE | **0.354 MPa** |
| Pressure R² | **0.997** |
| Physics Violations | **0.0%** |
| Inference Time | **3.19 ms** (CPU) |
| Speedup vs Simulator | **282,350x** |
| Parameters | **57,802** |

</div>

## Model Description

GeoForce maps static reservoir properties (temperature, permeability, porosity, depth, well locations, pressure) directly to spatiotemporal temperature and pressure fields over a 20-year production horizon. It was designed for rapid uncertainty quantification and well placement optimization in Indonesian geothermal systems.

### Architecture

Encoder-Residual-Decoder CNN:

```
Input (batch, 6, 32, 32)
        ↓
[Encoder]
  ConvBlock: Conv2d(6→32, 3×3) + BatchNorm + ReLU
  ConvBlock: Conv2d(32→32, 3×3) + BatchNorm + ReLU
        ↓
[Middle]
  ResidualBlock: Conv→BN→ReLU→Conv→BN + skip → ReLU
  ResidualBlock: Conv→BN→ReLU→Conv→BN + skip → ReLU
        ↓
[Decoder]
  ConvBlock: Conv2d(32→32, 3×3) + BatchNorm + ReLU
  Conv2d(32→10, 1×1) + Sigmoid
        ↓
Output (batch, 10, 32, 32)
  → Channels 0-4: Temperature at years 4, 8, 12, 16, 20
  → Channels 5-9: Pressure at years 4, 8, 12, 16, 20
```

### Input Channels (6)

| Channel | Content | Normalization |
|---------|---------|---------------|
| 0 | Initial temperature field (°C) | (T - 25) / 325 |
| 1 | Log₁₀ permeability (m²) | (log_k + 16) / 4 |
| 2 | Well mask (Gaussian decay) | [0, 1] |
| 3 | Base pressure (Pa) | (P - 5e6) / 20e6 |
| 4 | Porosity | (φ - 0.01) / 0.14 |
| 5 | Depth (m) | (z - 800) / 1700 |

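The normalization constants in the table can be wrapped in a small helper. The sketch below is illustrative (the `pack_inputs` name is not part of the released API); it builds the 6-channel input the same way the Quick Start does, using NumPy:

```python
import numpy as np

def pack_inputs(T0, log_k, well_mask, P0, phi, z, grid=32):
    """Build a (1, 6, grid, grid) input array from scalar reservoir properties.

    Normalization constants follow the channel table in this README.
    """
    inp = np.zeros((1, 6, grid, grid), dtype=np.float32)
    inp[0, 0] = (T0 - 25.0) / 325.0    # temperature: 25-350 °C  -> [0, 1]
    inp[0, 1] = (log_k + 16.0) / 4.0   # log10 perm: -16..-12    -> [0, 1]
    inp[0, 2] = well_mask              # already in [0, 1]
    inp[0, 3] = (P0 - 5e6) / 20e6      # pressure: 5-25 MPa      -> [0, 1]
    inp[0, 4] = (phi - 0.01) / 0.14    # porosity: 0.01-0.15     -> [0, 1]
    inp[0, 5] = (z - 800.0) / 1700.0   # depth: 800-2500 m       -> [0, 1]
    return inp
```

The returned array can be converted with `torch.from_numpy` before calling the model.
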
+
### Physics-Informed Loss
|
| 108 |
+
|
| 109 |
+
The training loss combines data-driven MSE with three physics constraints:
|
| 110 |
+
|
| 111 |
+
```
|
| 112 |
+
L_total = L_data + 0.1 × L_physics
|
| 113 |
+
L_physics = 0.1 × L_temporal + 0.01 × L_spatial + 0.001 × L_darcy
|
| 114 |
+
```
|
| 115 |
+
|
| 116 |
+
- **Temporal smoothness**: Penalizes non-physical jumps between timesteps (energy conservation proxy)
|
| 117 |
+
- **Spatial smoothness**: Discrete Laplacian penalty enforcing diffusion behavior
|
| 118 |
+
- **Darcy coupling**: Constrains pressure gradients in high-permeability zones
|
| 119 |
+
|
| 120 |
+
## Training
|
| 121 |
+
|
| 122 |
+
- **Data**: 1,000 synthetic reservoir simulations via implicit finite-difference solver (coupled heat + Darcy flow)
|
| 123 |
+
- **Parameter ranges**: Calibrated to Indonesian geothermal fields (Kamojang, Wayang Windu, Darajat)
|
| 124 |
+
- **Split**: 800 train / 100 val / 100 test (seed=42)
|
| 125 |
+
- **Optimizer**: Adam (lr=1e-3, ReduceLROnPlateau)
|
| 126 |
+
- **Training time**: 44.9 minutes on CPU (no GPU required)
|
| 127 |
+
- **Best epoch**: 352 / 402 (early stopped, patience=50)
|
| 128 |
+
- **Best validation loss**: 0.000382
|
| 129 |
+
|
| 130 |
+
## Quick Start
|
| 131 |
+
|
| 132 |
+
```python
|
| 133 |
+
import torch
|
| 134 |
+
import numpy as np
|
| 135 |
+
from reservoir_cnn import ReservoirCNN
|
| 136 |
+
|
| 137 |
+
# Load model
|
| 138 |
+
model = ReservoirCNN(in_channels=6, n_time_steps=5, base_filters=32)
|
| 139 |
+
checkpoint = torch.load("geoforce_cnn_v1.1.pt", map_location="cpu", weights_only=False)
|
| 140 |
+
model.load_state_dict(checkpoint["model_state_dict"])
|
| 141 |
+
model.eval()
|
| 142 |
+
|
| 143 |
+
# Prepare input: 6-channel tensor on 32×32 grid
|
| 144 |
+
# Example: Kamojang-like reservoir
|
| 145 |
+
base_temp = 245.0 # °C
|
| 146 |
+
log_perm = -14.0 # log10(m²)
|
| 147 |
+
base_pressure = 15e6 # Pa (15 MPa)
|
| 148 |
+
porosity = 0.08
|
| 149 |
+
depth = 1500.0 # m
|
| 150 |
+
n_wells = 3
|
| 151 |
+
|
| 152 |
+
# Normalize inputs
|
| 153 |
+
inp = torch.zeros(1, 6, 32, 32)
|
| 154 |
+
inp[0, 0] = (base_temp - 25) / 325 # Temperature
|
| 155 |
+
inp[0, 1] = (log_perm + 16) / 4 # Permeability
|
| 156 |
+
# Channel 2: well mask (place wells with Gaussian decay)
|
| 157 |
+
for _ in range(n_wells):
|
| 158 |
+
wx, wy = np.random.randint(4, 28, size=2)
|
| 159 |
+
for i in range(32):
|
| 160 |
+
for j in range(32):
|
| 161 |
+
d = ((i - wy)**2 + (j - wx)**2) / 50.0
|
| 162 |
+
inp[0, 2, i, j] = max(inp[0, 2, i, j].item(), np.exp(-d))
|
| 163 |
+
inp[0, 3] = (base_pressure - 5e6) / 20e6 # Pressure
|
| 164 |
+
inp[0, 4] = (porosity - 0.01) / 0.14 # Porosity
|
| 165 |
+
inp[0, 5] = (depth - 800) / 1700 # Depth
|
| 166 |
+
|
| 167 |
+
# Run inference
|
| 168 |
+
with torch.no_grad():
|
| 169 |
+
output = model(inp) # (1, 10, 32, 32) normalized [0,1]
|
| 170 |
+
|
| 171 |
+
# Denormalize to physical units
|
| 172 |
+
temperature = output[0, :5] * 350.0 # °C, shape (5, 32, 32)
|
| 173 |
+
pressure = output[0, 5:] * 50e6 # Pa, shape (5, 32, 32)
|
| 174 |
+
|
| 175 |
+
print(f"Year 20 avg temperature: {temperature[4].mean():.1f}°C")
|
| 176 |
+
print(f"Year 20 avg pressure: {pressure[4].mean()/1e6:.2f} MPa")
|
| 177 |
+
```
|
| 178 |
+
|
## Validation Results

### Per-Timestep Accuracy

| Year | Temp RMSE (°C) | Pressure RMSE (MPa) |
|------|---------------|---------------------|
| 4 | 4.025 | 0.356 |
| 8 | 3.262 | 0.364 |
| 12 | 3.333 | 0.326 |
| 16 | 3.005 | 0.332 |
| 20 | 2.811 | 0.390 |

### Benchmark Comparison

| Method | Temp Accuracy | Inference | Hardware |
|--------|--------------|-----------|----------|
| TOUGH2 (full physics) | Exact | 15-30 min | CPU |
| USGS Brady Hot Springs ML | < 3.68% relative | 0.9 s | GPU |
| Fourier Neural Operator | ~1-3% error | ~10 ms | GPU required |
| **GeoForce v1.1** | **3.31°C RMSE (1.3%)** | **3.2 ms** | **CPU only** |

## Version History

| Version | Params | Temp R² | Pressure R² | Status |
|---------|--------|---------|-------------|--------|
| v1.0 | 56,938 | 0.975 | -0.003 | Retired (pressure prediction failed) |
| **v1.1** | **57,802** | **0.994** | **0.997** | **Current — all thresholds pass** |

The v1.0 failure was caused by compressing three physical parameters (pressure, porosity, depth) into a single input channel. Each parameter now has its own dedicated channel in v1.1. Full failure analysis is documented in the [technical report](technical-report.md).

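That failure mode is easy to reproduce in isolation. The sketch below uses a hypothetical averaging encoder (not the exact v1.0 encoding, which is documented in the technical report); the point is that any many-to-one channel packing makes physically distinct reservoirs indistinguishable to the network:

```python
# Hypothetical v1.0-style packing: average three normalized parameters
# into one channel value. The mapping is not invertible.
def composite_channel(p_norm, phi_norm, z_norm):
    return (p_norm + phi_norm + z_norm) / 3.0

a = composite_channel(0.9, 0.1, 0.5)  # high pressure, low porosity
b = composite_channel(0.1, 0.9, 0.5)  # low pressure, high porosity
# a == b: two very different reservoirs collapse to the same input value,
# so no network downstream can recover the pressure field correctly.
```

Giving each parameter its own channel, as in v1.1, removes this ambiguity by construction.
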
## Limitations

1. **2D grid only** (32×32 horizontal slice) — no vertical flow or gravity convection
2. **Single-phase** liquid water — no steam-water phase transitions
3. **Homogeneous permeability** per scenario — no fracture networks
4. **Synthetic training data** — not yet validated against real field measurements
5. **Max error of 47.3°C** in extreme edge cases near training distribution boundaries

## Use Cases

- **Uncertainty quantification**: Run 10,000+ Monte Carlo samples in < 1 minute
- **Well placement optimization**: Evaluate hundreds of configurations in seconds
- **Screening studies**: Rapidly assess reservoir viability before committing to full simulation
- **Education**: Demonstrate reservoir physics concepts with instant feedback

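For the Monte Carlo workflow, inputs can be stacked into one `(N, 6, 32, 32)` batch and evaluated in a single forward pass. A minimal sampling sketch (uniform distributions over the ranges implied by the normalization table are an illustrative assumption, not the training sampler):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000  # number of Monte Carlo samples

# Sample scalar reservoir properties over the ranges implied by the
# normalization table (uniform is an illustrative choice).
T0    = rng.uniform(25.0, 350.0, N)    # °C
log_k = rng.uniform(-16.0, -12.0, N)   # log10(m²)
P0    = rng.uniform(5e6, 25e6, N)      # Pa
phi   = rng.uniform(0.01, 0.15, N)
z     = rng.uniform(800.0, 2500.0, N)  # m

batch = np.zeros((N, 6, 32, 32), dtype=np.float32)
batch[:, 0] = ((T0 - 25.0) / 325.0)[:, None, None]
batch[:, 1] = ((log_k + 16.0) / 4.0)[:, None, None]
# Channel 2 (well mask) is left at zero here; add wells per scenario as needed.
batch[:, 3] = ((P0 - 5e6) / 20e6)[:, None, None]
batch[:, 4] = ((phi - 0.01) / 0.14)[:, None, None]
batch[:, 5] = ((z - 800.0) / 1700.0)[:, None, None]

# inp = torch.from_numpy(batch)  # then: model(inp) in one forward pass
```

At ~3.2 ms per forward pass on CPU, even modest batching puts 10,000 scenarios well under a minute.
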
## Files

| File | Description |
|------|-------------|
| `geoforce_cnn_v1.1.pt` | PyTorch checkpoint (244 KB) |
| `reservoir_cnn.py` | Model architecture (ReservoirCNN class) |
| `validation_metrics_v1.1.json` | Full validation results on 100 test samples |
| `hyperparams.yaml` | Training configuration |
| `training_log.json` | Complete training history (402 epochs) |
| `technical-report.md` | Full technical report with methodology and failure analysis |

## Citation

```bibtex
@software{geoforce2026,
  title={GeoForce: Physics-Informed CNN Surrogate for Geothermal Reservoir Prediction},
  author={Riupassa, Robi Dany},
  year={2026},
  publisher={ForceX AI},
  url={https://huggingface.co/ForceX-AI/GeoForce-CNN-v1.1},
  version={1.1.0}
}
```

## About ForceX AI

ForceX AI builds AI-powered tools for the energy industry — geothermal, renewable, oil & gas, and nuclear. Our models are trained on real simulation data and validated against industry benchmarks.

- Platform: [platform.forcex-ai.com](https://platform.forcex-ai.com)
- Website: [forcex-ai.com](https://forcex-ai.com)
geoforce_cnn_v1.1.pt
ADDED
@@ -0,0 +1,3 @@

version https://git-lfs.github.com/spec/v1
oid sha256:3701d0ea2bcb4d8a0d68eb8e8ed1d79b3bb010a8396227e4dbddbc4327746386
size 248595
hyperparams.yaml
ADDED
@@ -0,0 +1,29 @@

# GeoForce CNN Surrogate — Training Hyperparameters

model:
  architecture: "ReservoirCNN"
  params: 57802
  grid_size: 32
  in_channels: 6   # T0, log10 permeability, well_mask, P0, porosity, depth
  out_channels: 2  # per timestep: T_predicted, P_predicted
  n_timesteps: 5

training:
  epochs: 500
  learning_rate: 1.0e-3
  batch_size: 32
  optimizer: "Adam"
  scheduler: "ReduceLROnPlateau"
  physics_weight: 0.1
  early_stopping_patience: 50

data:
  source: "tough2_simulations"
  train_split: 0.8
  val_split: 0.1
  test_split: 0.1

compute:
  device: "cpu"
  estimated_time: "30 minutes"
  platform: "vps"
reservoir_cnn.py
ADDED
@@ -0,0 +1,302 @@

"""CNN surrogate model for geothermal reservoir prediction.

Takes reservoir parameters (permeability field, well locations, boundary conditions)
as input and predicts temperature and pressure fields at multiple time steps.

Implements a geothermal-specific physics loss (temporal and spatial smoothness as
an energy-conservation proxy, plus a soft Darcy's-law residual).
"""

from __future__ import annotations

import logging
from typing import Any

import torch
import torch.nn as nn
import torch.nn.functional as F

logger = logging.getLogger(__name__)


class ConvBlock(nn.Module):
    """Convolutional block: Conv2d -> BatchNorm -> ReLU."""

    def __init__(self, in_channels: int, out_channels: int, kernel_size: int = 3) -> None:
        super().__init__()
        self.conv = nn.Conv2d(
            in_channels, out_channels, kernel_size,
            padding=kernel_size // 2, bias=False,
        )
        self.bn = nn.BatchNorm2d(out_channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.relu(self.bn(self.conv(x)))


class ResidualBlock(nn.Module):
    """Residual block with skip connection."""

    def __init__(self, channels: int) -> None:
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        residual = x
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + residual
        return self.relu(out)


class ReservoirCNN(nn.Module):
    """CNN surrogate model for geothermal reservoir prediction.

    Architecture: encoder-residual-decoder (3 conv blocks + 2 residual blocks),
    ~58K parameters with the 6-channel v1.1 input. Designed for CPU training.

    Input channels: 3 by default (permeability, well_mask, boundary_conditions);
    the v1.1 release uses 6 (initial temperature, log10 permeability, well_mask,
    base pressure, porosity, depth).
    Output channels (2 * n_time_steps): T and P fields at each time step.

    Physics loss includes:
    - Temporal/spatial smoothness (energy conservation proxy for the heat equation)
    - Darcy flow residual (pressure-permeability relationship)
    Output bounds are enforced architecturally by the final Sigmoid.

    Attributes:
        in_channels: Number of input channels (default 3; 6 for v1.1).
        out_channels: Number of output channels (2 * n_time_steps).
        n_time_steps: Number of prediction time steps.
        lambda_physics: Weight for physics loss term.
    """

    def __init__(
        self,
        in_channels: int = 3,
        n_time_steps: int = 5,
        base_filters: int = 32,
        lambda_physics: float = 0.1,
    ) -> None:
        """Initialize the ReservoirCNN.

        Args:
            in_channels: Number of input channels (6 for the v1.1 encoding).
            n_time_steps: Number of time steps to predict.
            base_filters: Number of filters in the first conv layer.
            lambda_physics: Weight for physics loss term.
        """
        super().__init__()

        self.in_channels = in_channels
        self.n_time_steps = n_time_steps
        self.out_channels = 2 * n_time_steps  # T and P for each time step
        self.lambda_physics = lambda_physics

        # Encoder: progressively extract features
        self.encoder = nn.Sequential(
            ConvBlock(in_channels, base_filters, kernel_size=3),
            ConvBlock(base_filters, base_filters, kernel_size=3),
        )

        # Middle: residual blocks for feature processing
        self.middle = nn.Sequential(
            ResidualBlock(base_filters),
            ResidualBlock(base_filters),
        )

        # Decoder: map features to output channels
        self.decoder = nn.Sequential(
            ConvBlock(base_filters, base_filters, kernel_size=3),
            nn.Conv2d(base_filters, self.out_channels, kernel_size=1),
            nn.Sigmoid(),  # Output in [0, 1] (normalized T and P)
        )

        # Initialize weights
        self._init_weights()

        n_params = sum(p.numel() for p in self.parameters())
        logger.info(
            "ReservoirCNN: in=%d, out=%d, params=%d",
            in_channels, self.out_channels, n_params,
        )

    def _init_weights(self) -> None:
        """Initialize weights using Kaiming initialization."""
        for m in self.modules():
            if isinstance(m, nn.Conv2d):
                nn.init.kaiming_normal_(m.weight, mode="fan_out", nonlinearity="relu")
            elif isinstance(m, nn.BatchNorm2d):
                nn.init.constant_(m.weight, 1)
                nn.init.constant_(m.bias, 0)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        """Forward pass through the CNN.

        Args:
            x: Input tensor of shape (batch, in_channels, H, W).

        Returns:
            Output tensor of shape (batch, out_channels, H, W).
            First n_time_steps channels are normalized temperature,
            last n_time_steps channels are normalized pressure.
        """
        features = self.encoder(x)
        features = self.middle(features)
        return self.decoder(features)

    def predict(
        self,
        x: torch.Tensor,
        denormalize: bool = True,
    ) -> dict[str, torch.Tensor]:
        """Run prediction and optionally denormalize outputs.

        Args:
            x: Input tensor of shape (batch, in_channels, H, W).
            denormalize: If True, convert from [0, 1] to physical units.

        Returns:
            Dict with 'temperature' (Celsius) and 'pressure' (Pa) tensors.
        """
        self.eval()
        with torch.no_grad():
            out = self.forward(x)

        T_norm = out[:, :self.n_time_steps]
        P_norm = out[:, self.n_time_steps:]

        if denormalize:
            T = T_norm * 350.0  # [0, 350] Celsius
            P = P_norm * 50e6   # [0, 50 MPa]
        else:
            T = T_norm
            P = P_norm

        return {"temperature": T, "pressure": P}

    def physics_loss(
        self,
        x: torch.Tensor,
        y_pred: torch.Tensor,
        **kwargs: Any,
    ) -> torch.Tensor:
        """Compute geothermal-specific physics loss.

        Includes:
        1. Temporal smoothness: T and P should change gradually over time
        2. Spatial smoothness: enforces diffusion-like behavior via Laplacian
        3. Darcy coupling: in high-permeability zones the same flux requires a
           smaller pressure gradient, so large gradients there are penalized

        Args:
            x: Input tensor (batch, in_channels, H, W).
            y_pred: Predicted output (batch, out_channels, H, W).
            **kwargs: Unused.

        Returns:
            Scalar physics loss tensor.
        """
        nt = self.n_time_steps
        T_pred = y_pred[:, :nt]
        P_pred = y_pred[:, nt:]

        loss = torch.zeros((), device=y_pred.device, dtype=y_pred.dtype)

        # 1. Temporal smoothness loss: penalize large time step jumps
        if nt > 1:
            dT_dt = T_pred[:, 1:] - T_pred[:, :-1]
            dP_dt = P_pred[:, 1:] - P_pred[:, :-1]
            temporal_loss = torch.mean(dT_dt**2) + torch.mean(dP_dt**2)
            loss = loss + 0.1 * temporal_loss

        # 2. Spatial smoothness (Laplacian penalty) — encourages diffusion
        # Compute discrete Laplacian using convolution
        laplacian_kernel = torch.tensor(
            [[0, 1, 0], [1, -4, 1], [0, 1, 0]],
            dtype=y_pred.dtype, device=y_pred.device,
        ).reshape(1, 1, 3, 3)

        # Apply to each time step of T
        for t in range(nt):
            T_t = T_pred[:, t:t+1]
            lap_T = F.conv2d(T_t, laplacian_kernel, padding=1)
            loss = loss + 0.01 * torch.mean(lap_T**2)

        # 3. Darcy coupling: permeability-pressure relationship.
        # The legacy 3-channel layout packs permeability in channel 0; the
        # v1.1 6-channel layout packs log-permeability in channel 1.
        if x.shape[1] >= 2:
            perm = x[:, 1:2] if x.shape[1] >= 6 else x[:, 0:1]

            # Pressure gradient magnitude (forward differences)
            for t in range(nt):
                P_t = P_pred[:, t:t+1]
                dP_dx = P_t[:, :, :, 1:] - P_t[:, :, :, :-1]
                dP_dy = P_t[:, :, 1:, :] - P_t[:, :, :-1, :]

                # In high permeability zones, flow should be easier (smaller
                # gradient for the same flux). This is a soft Darcy constraint.
                perm_dx = perm[:, :, :, 1:]
                perm_dy = perm[:, :, 1:, :]

                darcy_x = torch.mean((dP_dx * perm_dx)**2)
                darcy_y = torch.mean((dP_dy * perm_dy)**2)

                loss = loss + 0.001 * (darcy_x + darcy_y)

        return loss

    def data_loss(
        self,
        y_pred: torch.Tensor,
        y_true: torch.Tensor,
    ) -> torch.Tensor:
        """Compute data-driven MSE loss.

        Args:
            y_pred: Predicted output.
            y_true: Ground truth output.

        Returns:
            Scalar MSE loss.
        """
        return F.mse_loss(y_pred, y_true)

    def total_loss(
        self,
        x: torch.Tensor,
        y_true: torch.Tensor,
        **kwargs: Any,
    ) -> dict[str, torch.Tensor]:
        """Compute total loss = data_loss + lambda * physics_loss.

        Compatible with SurrogateTrainer interface.

        Args:
            x: Input tensor.
            y_true: Ground truth target tensor.
            **kwargs: Additional physics arguments.

        Returns:
            Dict with 'total', 'data', and 'physics' loss tensors.
        """
        y_pred = self.forward(x)
        loss_data = self.data_loss(y_pred, y_true)
        loss_physics = self.physics_loss(x, y_pred, **kwargs)
        loss_total = loss_data + self.lambda_physics * loss_physics

        return {
            "total": loss_total,
            "data": loss_data,
            "physics": loss_physics,
        }

    def count_parameters(self) -> int:
        """Return total number of trainable parameters."""
        return sum(p.numel() for p in self.parameters() if p.requires_grad)
technical-report.md
ADDED
|
@@ -0,0 +1,412 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
# GeoForce: Physics-Informed CNN Surrogate for Geothermal Reservoir Prediction
|
| 2 |
+
|
| 3 |
+
**Technical Report v1.1**
|
| 4 |
+
**Date:** 2026-03-30
|
| 5 |
+
**Authors:** ForceX AI
|
| 6 |
+
**Contact:** Robi Dany Riupassa
|
| 7 |
+
|
| 8 |
+
---
|
| 9 |
+
|
| 10 |
+
## Abstract
|
| 11 |
+
|
| 12 |
+
We present GeoForce, a physics-informed convolutional neural network (CNN) surrogate model for predicting temperature and pressure field evolution in geothermal reservoirs. The model replaces computationally expensive numerical reservoir simulators with a 57,802-parameter encoder-residual-decoder CNN that achieves 3.31 C temperature RMSE (R-squared = 0.994) and 0.35 MPa pressure RMSE (R-squared = 0.997) on 100 held-out test simulations, with zero physics constraint violations. Inference takes 3.2 milliseconds on CPU hardware, representing a 282,350x speedup over a single finite-difference simulation. The model is trained on 1,000 reservoir simulations generated via implicit finite-difference solvers for coupled heat conduction and Darcy flow, with parameter ranges drawn from Indonesian vapor-dominated geothermal systems (Kamojang, Wayang Windu, Darajat). We report the full development history, including a failed first version (v1.0) where a 3-channel input encoding caused complete pressure prediction failure (R-squared = -0.003), and the corrective redesign to 6 separate input channels in v1.1. All validation thresholds now pass, and the model meets or exceeds the accuracy reported by the USGS Brady Hot Springs ML surrogate benchmark.
|
| 13 |
+
|
| 14 |
+
---

## 1. Introduction

### 1.1 Problem Statement

Indonesia holds the world's second-largest geothermal energy potential at an estimated 23.9 GW, of which only approximately 2.4 GW (10%) has been developed as of 2023 (Darma et al., 2010; MEMR Indonesia, 2023). The primary barrier to faster development is the high cost and risk of exploration drilling: a single geothermal exploration well costs $7-8 million USD, with a historical success rate of approximately 60% (Saptadji, 2017). Reducing this risk requires accurate prediction of subsurface temperature and pressure conditions before drilling.

Numerical reservoir simulators such as TOUGH2 (Pruess, 1991), TOUGH3 (Jung et al., 2017), and FEHM (Zyvoloski et al., 1997) can model subsurface thermal-hydraulic behavior with high fidelity. However, a single forward simulation of a reservoir model on a modern workstation requires 15-30 minutes for a 2D grid and 2-8 hours for a full 3D model. This computational cost makes three important workflows impractical:

1. **Uncertainty quantification**: Monte Carlo sampling with 10,000+ forward runs requires weeks of computation.
2. **Well placement optimization**: Evaluating hundreds of candidate well locations requires hundreds of forward simulations.
3. **Real-time decision support**: Operational decisions during drilling cannot wait hours for simulator results.

### 1.2 Approach

GeoForce addresses the speed-fidelity tradeoff by training a CNN surrogate that maps static reservoir properties directly to spatiotemporal temperature and pressure fields. The surrogate learns the input-output relationship from 1,000 reservoir simulations and enforces physical consistency through a composite loss function that includes temporal smoothness, spatial diffusion, and Darcy flow coupling terms.

### 1.3 Contributions

1. A validated CNN surrogate for geothermal reservoir prediction with R-squared > 0.99 for both temperature and pressure.
2. A complete data generation pipeline using implicit finite-difference reservoir simulation with Latin Hypercube Sampling over parameter ranges from Indonesian geothermal fields.
3. An honest account of the v1.0 failure (3-channel input encoding) and the diagnostic reasoning that led to the v1.1 fix.
4. Benchmark comparison against the USGS Brady Hot Springs ML surrogate.

---

## 2. Methodology

### 2.1 Reservoir Simulator

Training data is generated using a custom finite-difference reservoir simulator that solves coupled heat conduction and Darcy flow on a 2D grid. The governing equations are:

**Heat equation:**

```
rho_eff * Cp_eff * dT/dt = div(k_thermal * grad(T)) + Q_wells
```

**Darcy flow (mass balance):**

```
porosity * beta * dP/dt = div((k_perm / mu) * grad(P)) + q_wells
```

where rho_eff is the effective bulk density, Cp_eff is the effective heat capacity, k_thermal is thermal conductivity, k_perm is rock permeability, mu is dynamic viscosity, beta is fluid compressibility, and Q_wells and q_wells are heat and mass source/sink terms at well locations.

The simulator uses backward Euler (implicit) time-stepping for stability, with the spatial derivatives discretized using second-order central differences on a 32x32 grid. The resulting sparse linear systems are solved using scipy.sparse.linalg.spsolve at each timestep.
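To make the time-stepping concrete, here is a minimal sketch of one backward-Euler step for the heat equation on a square grid with zero-flux boundaries. It illustrates the scheme described above with a constant diffusivity; the function name and this simplification are ours, not the released simulator's.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def implicit_heat_step(T, dt, dx, alpha):
    """One backward-Euler step of dT/dt = alpha * laplacian(T) on an
    n x n grid with zero-flux boundaries, solved as the sparse system
    (I - dt*alpha*L) T_new = T_old."""
    n = T.shape[0]
    r = alpha * dt / dx**2
    # 1D second-difference operator with zero-flux (Neumann) boundaries
    main = -2.0 * np.ones(n)
    main[0] = main[-1] = -1.0
    off = np.ones(n - 1)
    D = sp.diags([off, main, off], [-1, 0, 1])
    # 2D five-point Laplacian via Kronecker sum
    I1 = sp.identity(n)
    L = sp.kron(I1, D) + sp.kron(D, I1)
    A = sp.identity(n * n) - r * L
    return spsolve(A.tocsc(), T.ravel()).reshape(n, n)
```

Because the scheme is implicit, the step is unconditionally stable: the new field stays within the bounds of the old one, and with zero-flux boundaries the mean temperature is conserved.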

**Fixed physical constants:**

| Parameter | Value | Unit |
|-----------|-------|------|
| Rock density (rho_s) | 2700 | kg/m^3 |
| Rock specific heat (Cp_s) | 900 | J/(kg K) |
| Water density (rho_f) | 1000 | kg/m^3 |
| Water specific heat (Cp_f) | 4186 | J/(kg K) |
| Thermal conductivity (k_th) | 2.5 | W/(m K) |
| Water viscosity (mu) | 3e-4 | Pa s |
| Fluid compressibility (beta) | 4.5e-10 | 1/Pa |

**Grid specification:**

| Property | Value |
|----------|-------|
| Grid dimensions | 32 x 32 cells |
| Domain size | 1,000 x 1,000 m |
| Cell size | 31.25 m |
| Simulation duration | 20 years |
| Output timesteps | 5 (years 4, 8, 12, 16, 20) |

### 2.2 Simulation Campaign

We generated 1,000 reservoir scenarios using Latin Hypercube Sampling (LHS) over six parameters, with ranges calibrated to Indonesian geothermal fields:

| Parameter | Min | Max | Unit | Reference Fields |
|-----------|-----|-----|------|-----------------|
| Base temperature | 180 | 320 | C | Kamojang (245 C), Wayang Windu (270 C), Darajat (250 C) |
| Base pressure | 5 | 25 | MPa | Typical hydrostatic at 500-2500 m depth |
| Log10 permeability | -16 | -12 | log10(m^2) | Fractured volcanic rock range |
| Porosity | 0.01 | 0.15 | fraction | Dense to moderately porous andesite |
| Depth | 800 | 2500 | m | Shallow to deep Indonesian reservoirs |
| Number of wells | 1 | 5 | count | Small to medium field configuration |
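The sampling step can be sketched with scipy's quasi-Monte Carlo module. The exact sampler configuration used for the campaign is not stated in this report, so the seed handling and the rounding of the well count below are assumptions:

```python
import numpy as np
from scipy.stats import qmc

# Parameter bounds, in the order: base T (C), base P (MPa),
# log10 permeability, porosity, depth (m), number of wells
lower = [180, 5, -16, 0.01, 800, 1]
upper = [320, 25, -12, 0.15, 2500, 5]

sampler = qmc.LatinHypercube(d=6, seed=42)
unit_samples = sampler.random(n=1000)            # points in [0, 1)^6
params = qmc.scale(unit_samples, lower, upper)   # rescale to physical units

# Well count must be an integer (kept as a float column here)
params[:, 5] = np.round(params[:, 5])
```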

All 1,000 simulations converged successfully. Each scenario produces:

- Temperature fields: (5, 32, 32) array in Celsius
- Pressure fields: (5, 32, 32) array in Pascals
- Well locations: (n_wells, 2) array of grid coordinates
- Parameter vector: 6 scalar values

Data is stored as individual .npz files (scenario_0000.npz through scenario_0999.npz) and split deterministically using seed=42:

- Training: 800 scenarios
- Validation: 100 scenarios
- Test: 100 scenarios
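A deterministic split consistent with these counts can be produced as follows; the exact RNG call used by the project is an assumption, but any fixed-seed permutation gives the same reproducibility guarantee:

```python
import numpy as np

rng = np.random.default_rng(42)   # fixed seed so the split is reproducible
indices = rng.permutation(1000)   # shuffle scenario IDs 0..999 once
train_idx = indices[:800]
val_idx = indices[800:900]
test_idx = indices[900:]
```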

### 2.3 CNN Architecture

The ReservoirCNN follows an encoder-residual-decoder design:

```
Input (batch, 6, 32, 32)
    |
    v
[Encoder]
  ConvBlock: Conv2d(6 -> 32, 3x3, pad=1) + BatchNorm2d(32) + ReLU
  ConvBlock: Conv2d(32 -> 32, 3x3, pad=1) + BatchNorm2d(32) + ReLU
    |
    v
[Middle]
  ResidualBlock: Conv(32->32) + BN + ReLU + Conv(32->32) + BN + skip + ReLU
  ResidualBlock: Conv(32->32) + BN + ReLU + Conv(32->32) + BN + skip + ReLU
    |
    v
[Decoder]
  ConvBlock: Conv2d(32 -> 32, 3x3, pad=1) + BatchNorm2d(32) + ReLU
  Conv2d(32 -> 10, 1x1) + Sigmoid
    |
    v
Output (batch, 10, 32, 32)
```

**Parameter count:** 57,802 (v1.1, with 6 input channels)
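The diagram translates into the following PyTorch sketch. The released `reservoir_cnn.py` is the authoritative implementation; here we assume `bias=False` on convolutions that are immediately followed by BatchNorm (a common choice, and the one that reproduces the reported 57,802-parameter count):

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Conv-BN-ReLU-Conv-BN plus identity skip, then ReLU."""
    def __init__(self, ch):
        super().__init__()
        self.conv1 = nn.Conv2d(ch, ch, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(ch)
        self.conv2 = nn.Conv2d(ch, ch, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(ch)

    def forward(self, x):
        h = torch.relu(self.bn1(self.conv1(x)))
        h = self.bn2(self.conv2(h))
        return torch.relu(h + x)   # skip connection

class ReservoirCNN(nn.Module):
    def __init__(self, in_ch=6, hidden=32, out_ch=10):
        super().__init__()
        self.net = nn.Sequential(
            # Encoder: two ConvBlocks
            nn.Conv2d(in_ch, hidden, 3, padding=1, bias=False),
            nn.BatchNorm2d(hidden), nn.ReLU(),
            nn.Conv2d(hidden, hidden, 3, padding=1, bias=False),
            nn.BatchNorm2d(hidden), nn.ReLU(),
            # Middle: two residual blocks
            ResidualBlock(hidden),
            ResidualBlock(hidden),
            # Decoder: ConvBlock + 1x1 projection, sigmoid keeps outputs in [0, 1]
            nn.Conv2d(hidden, hidden, 3, padding=1, bias=False),
            nn.BatchNorm2d(hidden), nn.ReLU(),
            nn.Conv2d(hidden, out_ch, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)
```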

The output has 10 channels: channels 0-4 are temperature fields at years 4, 8, 12, 16, 20 (normalized to [0,1]), and channels 5-9 are pressure fields at the same timesteps.

Weight initialization uses Kaiming normal for convolutional layers and constant initialization (weight=1, bias=0) for BatchNorm layers.

### 2.4 Input Encoding

#### v1.0 (3 channels -- failed for pressure)

| Channel | Content | Normalization |
|---------|---------|---------------|
| 0 | Log permeability field | (log_k + 16) / 4 -> [0, 1] |
| 1 | Well mask (Gaussian decay from well locations) | [0, 1] |
| 2 | Boundary encoding (composite of base T, porosity, depth) | Clipped to [0, 1] |

The boundary encoding channel combined three physical quantities into a single scalar field using ad hoc weighting: `bc = clip(base_T_norm + 0.1 * porosity_norm * y_gradient + 0.05 * depth_norm)`. This encoding caused the CNN to lose the ability to distinguish between different pressure regimes: base pressure was omitted entirely, while porosity and depth -- the parameters most important for pressure prediction -- were compressed and mixed together.

#### v1.1 (6 channels -- current)

| Channel | Content | Normalization | Range |
|---------|---------|---------------|-------|
| 0 | Initial temperature field (T at t=0) | (T - 25) / (350 - 25) | [0, 1] |
| 1 | Log permeability | (log_k + 16) / 4 | [0, 1] |
| 2 | Well mask (Gaussian decay) | Raw values | [0, 1] |
| 3 | Base pressure | (P - 5e6) / (25e6 - 5e6) | [0, 1] |
| 4 | Porosity | (phi - 0.01) / (0.15 - 0.01) | [0, 1] |
| 5 | Depth | (z - 800) / (2500 - 800) | [0, 1] |

Each physical quantity occupies its own channel. The CNN receives unambiguous information about every parameter that affects both temperature and pressure evolution.
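As a sketch, the channel assembly for one scenario might look like the following; the function name and argument layout are illustrative, but the normalizations are exactly those in the table above:

```python
import numpy as np

def build_input_channels(T0, log_k, well_mask, base_P, porosity, depth):
    """Assemble the 6-channel v1.1 input for one scenario.
    T0, log_k, and well_mask are (32, 32) fields; base_P (Pa),
    porosity, and depth (m) are scalars broadcast to constant channels."""
    grid = np.ones((32, 32))
    channels = np.stack([
        (T0 - 25.0) / (350.0 - 25.0),                 # ch 0: initial temperature
        (log_k + 16.0) / 4.0,                         # ch 1: log10 permeability
        well_mask,                                    # ch 2: Gaussian well mask
        grid * (base_P - 5e6) / (25e6 - 5e6),         # ch 3: base pressure
        grid * (porosity - 0.01) / (0.15 - 0.01),     # ch 4: porosity
        grid * (depth - 800.0) / (2500.0 - 800.0),    # ch 5: depth
    ])
    return channels.astype(np.float32)                # shape (6, 32, 32)
```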

### 2.5 Physics-Informed Loss Function

The total training loss is:

```
L_total = L_data + lambda_physics * L_physics
```

where lambda_physics = 0.1.

**Data loss** (MSE between predicted and simulated fields):

```
L_data = (1/N) * sum_i || y_pred_i - y_true_i ||^2
```

**Physics loss** (weighted combination of three terms):

```
L_physics = 0.1 * L_temporal + 0.01 * L_spatial + 0.001 * L_darcy
```

1. **Temporal smoothness (L_temporal)**: Penalizes non-physical jumps between consecutive timesteps. Acts as a proxy for energy conservation -- temperature and pressure should evolve smoothly under diffusion-dominated dynamics.

2. **Spatial smoothness (L_spatial)**: Applies a discrete Laplacian filter to the predicted temperature fields and penalizes the squared magnitude. This enforces diffusion-like behavior and prevents checkerboard artifacts.

3. **Darcy coupling (L_darcy)**: Penalizes the product of pressure gradient magnitude and permeability. In high-permeability zones, pressure gradients should be small (fluid flows easily), so the product k * |grad(P)| should be bounded.
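The three terms can be sketched in PyTorch as follows. The report does not specify the exact discretizations or reductions, so the forward differences and mean reductions below are assumptions; the 0.1/0.01/0.001 weights match the formula above:

```python
import torch
import torch.nn.functional as F

def physics_loss(pred, perm_channel):
    """pred: (B, 10, 32, 32), channels 0-4 temperature, 5-9 pressure.
    perm_channel: (B, 1, 32, 32) normalized log-permeability input."""
    T, P = pred[:, :5], pred[:, 5:]
    # 1) temporal smoothness: squared jumps between consecutive timesteps
    l_temporal = ((T[:, 1:] - T[:, :-1]) ** 2).mean() + \
                 ((P[:, 1:] - P[:, :-1]) ** 2).mean()
    # 2) spatial smoothness: discrete Laplacian of the temperature fields
    lap = torch.tensor([[0., 1., 0.], [1., -4., 1.], [0., 1., 0.]])
    lap = lap.view(1, 1, 3, 3).repeat(5, 1, 1, 1)     # one filter per timestep
    l_spatial = (F.conv2d(T, lap, groups=5) ** 2).mean()
    # 3) Darcy coupling: permeability-weighted pressure gradient magnitude
    dPdx = P[:, :, :, 1:] - P[:, :, :, :-1]
    dPdy = P[:, :, 1:, :] - P[:, :, :-1, :]
    l_darcy = (perm_channel[:, :, :, 1:] * dPdx.abs()).mean() + \
              (perm_channel[:, :, 1:, :] * dPdy.abs()).mean()
    return 0.1 * l_temporal + 0.01 * l_spatial + 0.001 * l_darcy
```

A useful sanity check: a spatially and temporally constant prediction incurs zero physics loss, since all three terms are built from differences.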

### 2.6 Training Configuration

| Parameter | Value |
|-----------|-------|
| Optimizer | Adam (beta1=0.9, beta2=0.999) |
| Initial learning rate | 1e-3 |
| Scheduler | ReduceLROnPlateau (patience=10, factor=0.5) |
| Batch size | 32 |
| Maximum epochs | 500 |
| Early stopping patience | 50 epochs |
| Physics loss weight | 0.1 |
| Random seed | 42 |
| Compute | VPS CPU (2 cores, 7.5 GB RAM) |

### 2.7 Output Denormalization

Predictions are mapped back to physical units:

```
T_celsius = T_norm * (350 - 25) + 25
P_pascals = P_norm * (50e6 - 1e5) + 1e5
```
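A runnable version of this mapping, assuming the (10, 32, 32) output layout described in Section 2.3:

```python
import numpy as np

def denormalize(pred):
    """pred: (10, 32, 32) network output in [0, 1].
    Channels 0-4 map to temperature (C), channels 5-9 to pressure (Pa)."""
    T_celsius = pred[:5] * (350.0 - 25.0) + 25.0
    P_pascals = pred[5:] * (50e6 - 1e5) + 1e5
    return T_celsius, P_pascals
```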

---

## 3. Results

### 3.1 Training History

#### v1.0 (3-channel input)

| Metric | Value |
|--------|-------|
| Parameters | 56,938 |
| Training time | 14.0 minutes |
| Total epochs | 155 (early stopped) |
| Best epoch | 105 |
| Best validation loss | 0.007717 |

#### v1.1 (6-channel input)

| Metric | Value |
|--------|-------|
| Parameters | 57,802 |
| Training time | 44.9 minutes |
| Total epochs | 402 (early stopped) |
| Best epoch | 352 |
| Best validation loss | 0.000382 |

The 20x improvement in validation loss from v1.0 to v1.1 (0.00772 to 0.000382) confirms that the input encoding was the primary bottleneck, not the model capacity.

### 3.2 Test Set Results (v1.0 vs v1.1)

All metrics computed on 100 held-out test simulations using denormalized (physical unit) predictions.

| Metric | v1.0 | v1.1 | Improvement | Target | Status |
|--------|------|------|-------------|--------|--------|
| RMSE Temperature | 6.62 C | 3.31 C | 2.0x | < 5.0 C | v1.0: FAIL, v1.1: PASS |
| R-squared Temperature | 0.975 | 0.994 | +0.019 | > 0.95 | v1.0: PASS, v1.1: PASS |
| MAE Temperature | 5.31 C | 2.40 C | 2.2x | -- | -- |
| Max Error Temperature | 42.85 C | 47.31 C | slightly worse | -- | -- |
| RMSE Pressure | 6.22 MPa | 0.35 MPa | 17.6x | reasonable | v1.0: FAIL, v1.1: PASS |
| R-squared Pressure | -0.003 | 0.997 | fixed | > 0.95 | v1.0: FAIL, v1.1: PASS |
| MAE Pressure | 5.53 MPa | 0.26 MPa | 21.3x | -- | -- |
| Max Error Pressure | 13.17 MPa | 5.04 MPa | 2.6x | -- | -- |
| Physics Violations | 0.0% | 0.0% | same | 0% | PASS |
| Inference Time | 3.50 ms | 3.19 ms | same | < 1,000 ms | PASS |
| Speedup vs Simulator | 257,243x | 282,350x | same | > 1,000x | PASS |

v1.0 failed 2 of 7 thresholds (RMSE T and R-squared P). v1.1 passes all 7.

### 3.3 Per-Timestep Accuracy (v1.1)

| Timestep | Year | T RMSE (C) | P RMSE (MPa) |
|----------|------|------------|--------------|
| 1 | 4 | 4.025 | 0.356 |
| 2 | 8 | 3.262 | 0.364 |
| 3 | 12 | 3.333 | 0.326 |
| 4 | 16 | 3.005 | 0.332 |
| 5 | 20 | 2.811 | 0.390 |

Temperature RMSE decreases over time -- the model is more accurate for later timesteps, which is expected because diffusion produces smoother fields as time progresses. Pressure RMSE is stable across all timesteps at approximately 0.35 MPa.

### 3.4 Physics Constraint Compliance

| Metric | Value |
|--------|-------|
| Physics violation rate | 0.0000% |
| Temperature prediction range | [25, 350] C (enforced by sigmoid + denormalization) |
| Pressure prediction range | [0.1, 50] MPa (enforced by sigmoid + denormalization) |

The sigmoid output activation naturally constrains predictions to the physical range. The physics loss terms further encourage physically consistent spatial and temporal patterns.

### 3.5 Inference Performance

| Metric | Value |
|--------|-------|
| Mean inference time | 3.188 ms |
| Median inference time | 2.683 ms |
| Finite-difference simulation time | ~900 seconds |
| Speedup | 282,350x |
| Hardware | CPU only (2 cores) |

The speedup is the ratio of these times: 900 seconds / 0.003188 seconds, approximately 282,350x. The 900-second baseline is the time for one finite-difference simulation on the same hardware.

---

## 4. Discussion

### 4.1 The v1.0 Failure and What It Taught Us

Version 1.0 used three input channels. Channel 2 was a "boundary encoding" that attempted to combine base temperature, porosity, and depth into a single scalar field:

```
bc_field = clip(base_T_norm + 0.1 * porosity_norm * y_gradient + 0.05 * depth_norm)
```

This worked adequately for temperature prediction (R-squared = 0.975) because the base temperature dominated the encoding. But it destroyed pressure information: base pressure was not included at all, and porosity and depth (the two parameters that most directly control pressure behavior through hydrostatic head and fluid storage) were attenuated by factors of 0.1 and 0.05 respectively.

The result: the CNN could not distinguish between scenarios with different pressure regimes. It learned to predict the dataset mean for pressure, producing R-squared = -0.003 (marginally worse than a constant prediction).

The diagnostic was straightforward. We computed the mutual information between each input channel and the pressure targets; channel 2 (the boundary encoding) carried almost none. The fix was equally straightforward: give each physical quantity its own input channel.

This failure demonstrates why feature engineering matters even for neural networks. A CNN can learn complex spatial patterns, but it cannot recover information that was destroyed before it reaches the first layer.
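A cheap version of that diagnostic -- correlating each input channel's scenario mean with the mean pressure target -- can be sketched as follows. This is an illustrative screen for dead channels, not the project's exact mutual-information computation:

```python
import numpy as np

def channel_target_correlation(inputs, pressure_targets):
    """inputs: (N, C, H, W) input channels; pressure_targets: (N, T, H, W).
    Returns per-channel |Pearson r| between the channel's scenario mean and
    the scenario-mean pressure -- near-zero values flag uninformative channels."""
    x = inputs.reshape(inputs.shape[0], inputs.shape[1], -1).mean(axis=2)
    y = pressure_targets.reshape(pressure_targets.shape[0], -1).mean(axis=1)
    return np.array([abs(np.corrcoef(x[:, c], y)[0, 1])
                     for c in range(x.shape[1])])
```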

### 4.2 Benchmark Comparison

The USGS developed an ML surrogate for the Brady Hot Springs geothermal field in Nevada, reporting < 3.68% relative temperature error and 0.9 second inference time (Faulds et al., 2011; USGS ML benchmark).

| Metric | USGS Brady Hot Springs | GeoForce v1.1 |
|--------|----------------------|---------------|
| Temperature error | < 3.68% relative | 3.31 C RMSE (1.3% of range) |
| Pressure prediction | Not reported | 0.35 MPa RMSE (R-squared = 0.997) |
| Inference time | 0.9 s | 3.2 ms (281x faster) |
| Model size | Not reported | 57,802 parameters |
| Training data | Real field data | 1,000 synthetic simulations |

Direct comparison is limited because the USGS benchmark uses real field data on a different grid size and reservoir type. However, GeoForce's 3.31 C RMSE on a 25-350 C range (1.3% relative error) is comparable to the USGS's < 3.68% threshold, and GeoForce achieves this with 281x faster inference.

### 4.3 Strengths

1. **Dual-output prediction**: Unlike most published surrogate models that predict temperature only, GeoForce predicts both temperature and pressure fields simultaneously.
2. **Compact model**: 57,802 parameters enables CPU-only training and inference, with no GPU required at any stage.
3. **Physics enforcement**: Zero physics violations across all test samples, with physically consistent spatial and temporal patterns.
4. **Reproducibility**: Deterministic data splits (seed=42), saved indices, complete training logs, and versioned checkpoints.
5. **Speed**: 282,350x speedup enables Monte Carlo uncertainty quantification with 10,000+ samples in under 1 minute.

### 4.4 Limitations

1. **Simplified physics**: The training data uses a 2D finite-difference solver, not the full TOUGH2 multiphase simulator. Single-phase (liquid water) only -- no steam-water phase transitions, which are significant in vapor-dominated systems like Kamojang.

2. **2D grid**: The 32x32 grid represents a horizontal slice of the reservoir. Vertical flow and gravity-driven convection, which are important in real geothermal systems, are not modeled.

3. **Homogeneous permeability**: Each scenario uses a single scalar permeability value. Real reservoirs have spatially heterogeneous permeability fields with fracture networks.

4. **Fixed temporal resolution**: Five output timesteps at 4-year intervals. Finer temporal resolution would require architectural changes.

5. **Max error**: The maximum temperature error of 47.3 C occurs in edge cases (extreme parameter combinations near the boundary of the training distribution). While the average accuracy is high, individual predictions in corner cases require caution.

6. **No field validation**: All validation is against the same class of synthetic simulations used for training. Validation against real field data (e.g., published Kamojang monitoring data) is pending.

### 4.5 Comparison with Other Approaches

| Method | Temperature Accuracy | Speed | Data Requirement | Limitations |
|--------|---------------------|-------|-----------------|-------------|
| TOUGH2 (full physics) | Exact (within discretization) | 15-30 min/run | Reservoir characterization | Too slow for UQ |
| Reduced-order models | ~5-10% error | 1-10 s | Physics model | Limited nonlinearity |
| Fourier Neural Operator | ~1-3% error | ~10 ms | 1,000+ simulations | GPU required, 100K+ params |
| **GeoForce v1.1** | **3.31 C RMSE (1.3%)** | **3.2 ms** | **1,000 simulations** | **2D, single-phase** |

---

## 5. Conclusions

GeoForce v1.1 demonstrates that a compact CNN surrogate (57,802 parameters) can predict both temperature and pressure fields in geothermal reservoirs with high accuracy (R-squared > 0.99 for both quantities) and extreme speed (3.2 ms on CPU). The model passes all seven validation thresholds and achieves zero physics constraint violations.

The development process included a significant failure in v1.0, where a compressed 3-channel input encoding destroyed pressure information. The corrective redesign to 6 clean input channels fixed pressure prediction completely (R-squared from -0.003 to 0.997) and also improved temperature prediction (RMSE from 6.62 C to 3.31 C). This experience underscores the importance of careful feature engineering even when using deep learning.

### Next Steps

1. **TOUGH2 integration**: Replace the finite-difference solver with full TOUGH2 simulations for higher-fidelity training data, including multiphase (steam-water) behavior.
2. **Spatial heterogeneity**: Add spatially varying permeability fields using geostatistical methods (FFT-based Gaussian random fields).
3. **3D extension**: Extend the grid to 3D (32x32x16) to capture vertical flow and gravity effects.
4. **Field validation**: Compare predictions against published monitoring data from the Kamojang field (Darma et al., 2010; Hochstein and Sudarman, 2008).
5. **Expert review**: Submit for review by ITB Geothermal Engineering faculty.
6. **Pertamina partnership**: Use this validation report as the technical basis for a data-sharing proposal with Pertamina Geothermal Energy.

---

## 6. References

1. Pruess, K., "TOUGH2: A General-Purpose Numerical Simulator for Multiphase Fluid and Heat Flow," Report LBL-29400, Lawrence Berkeley National Laboratory, 1991. DOI: 10.2172/5212064
2. Jung, Y., Pau, G.S.H., Finsterle, S., Pollyea, R.M., "TOUGH3: A New Efficient Version of the TOUGH Suite of Multiphase Flow and Transport Simulators," Computers & Geosciences, 108, 2-7, 2017. DOI: 10.1016/j.cageo.2016.09.009
3. Zyvoloski, G.A., Robinson, B.A., Dash, Z.V., Trease, L.L., "Summary of the Models and Methods for the FEHM Application," Report LA-13307-MS, Los Alamos National Laboratory, 1997. DOI: 10.2172/565545
4. Raissi, M., Perdikaris, P., Karniadakis, G.E., "Physics-Informed Neural Networks: A Deep Learning Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations," Journal of Computational Physics, 378, 686-707, 2019. DOI: 10.1016/j.jcp.2018.10.045
5. Karniadakis, G.E., Kevrekidis, I.G., Lu, L., Perdikaris, P., Wang, S., Yang, L., "Physics-Informed Machine Learning," Nature Reviews Physics, 3, 422-440, 2021. DOI: 10.1038/s42254-021-00314-5
6. Darma, S., Hadi, J., Munandar, A., "Geothermal Energy Update: Geothermal Energy Development and Utilization in Indonesia," Proceedings World Geothermal Congress 2010, Bali, Indonesia, 2010.
7. Hochstein, M.P., Sudarman, S., "History of Geothermal Exploration in Indonesia from 1970 to 2000," Geothermics, 37(3), 220-266, 2008. DOI: 10.1016/j.geothermics.2008.01.001
8. Saptadji, N.M., "Reservoir Engineering of Geothermal Systems in Indonesia: A Review," IOP Conference Series: Earth and Environmental Science, 103, 012002, 2017. DOI: 10.1088/1755-1315/103/1/012002
9. O'Sullivan, M.J., Pruess, K., Lippmann, M.J., "State of the Art of Geothermal Reservoir Simulation," Geothermics, 30(4), 395-429, 2001. DOI: 10.1016/S0375-6505(01)00005-0
10. He, K., Zhang, X., Ren, S., Sun, J., "Deep Residual Learning for Image Recognition," Proceedings of the IEEE CVPR, 770-778, 2016. DOI: 10.1109/CVPR.2016.90
11. Kingma, D.P., Ba, J., "Adam: A Method for Stochastic Optimization," Proceedings of ICLR, 2015. arXiv:1412.6980
12. Faulds, J.E., Hinz, N.H., Coolbaugh, M.F., et al., "Assessment of Favorable Structural Settings of Geothermal Systems in the Great Basin, Western USA," GRC Transactions, 35, 777-783, 2011.

---

*ForceX AI*
*Report version: 1.1 (2026-03-30)*
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
{
  "best_epoch": 352,
  "best_val_loss": 0.00038183890865184367,
  "total_epochs": 402,
  "training_time_seconds": 2693.6318747997284,
  "training_time_minutes": 44.89386457999547,
  "n_params": 57802,
  "n_train": 800,
  "n_val": 100,
  "n_test": 100,
  "hyperparameters": {
    "lr": 0.001,
    "batch_size": 32,
    "physics_weight": 0.1,
    "patience": 50,
    "max_epochs": 500,
    "in_channels": 6,
    "optimizer": "Adam",
    "scheduler": "ReduceLROnPlateau"
  },
  "history": {
    "train_loss": [
      0.06285795867443085,
      0.0233237524330616,
      0.012001832202076911,
      0.008185481131076812,
      0.006295670475810766,
      0.006175436358898878,
      0.005470934472978115,
      0.004688903475180268,
      0.004516720063984394,
      0.004601768720895052,
      0.0036201783083379267,
      0.0025426888186484576,
      0.0022882478265091777,
      0.0019479554193094373,
      0.0017782183550298213,
      0.0017353794258087873,
      0.0016466466011479497,
      0.0017004495533183216,
      0.0017232021316885948,
      0.001708146147429943,
      0.001495752497576177,
      0.001526348600164056,
      0.00138232268858701,
      0.0013325945404358208,
      0.0013921306584961713,
      0.0014501933241263033,
      0.0012594463326968252,
      0.0014454138884320855,
      0.0014136086194775999,
      0.0014144902327097952,
      0.0013464440451934933,
      0.001223985378164798,
      0.0013326313602738083,
      0.001296287439763546,
      0.001237362055107951,
      0.0011216077045537532,
      0.0013147838367149234,
      0.0012998984614387154,
      0.0012273445795290172,
      0.0013524886034429074,
      0.001130248256959021,
      0.001051414345856756,
      0.0012211787118576467,
      0.0011794839333742857,
      0.0011152007477357984,
      0.001102191077079624,
      0.0011644980520941317,
      0.0011597009375691413,
      0.0010064879688434303,
      0.001048515650909394,
      0.001232145733665675,
      0.0010551571962423623,
      0.0011478795041330158,
      0.0009063346707262099,
      0.0010354268131777644,
      0.0009104882646352052,
      0.0011188926873728633,
      0.0010243728174827993,
      0.0011300408164970576,
      0.0010306360945105553,
      0.0011697667883709072,
      0.0010299476189538836,
      0.0010525384452193976,
      0.0010064833937212824,
      0.000991306861396879,
      0.0010190699715167284,
      0.00106169571634382,
      0.0010949878976680338,
      0.0008226513816043734,
      0.0009173669829033315,
      0.0009279339178465307,
      0.0009647550317458809,
      0.0009448344307020307,
      0.0009788452950306237,
      0.0009670134237967431,
      0.0008982863067649305,
      0.000909083636943251,
      0.000857233318965882,
      0.0008565206080675125,
      0.0008659585728310049,
      0.000806006669299677,
      0.0009573379508219659,
      0.0009313174220733345,
      0.001012483339291066,
      0.0008678532694466412,
      0.0009346322086639702,
      0.0009435082692652941,
      0.0008292775531299413,
      0.0008661792986094951,
      0.0008421232854016126,
      0.001034310490358621,
      0.0009181147255003453,
      0.0008689432824030518,
      0.0007600560877472162,
      0.001040756730362773,
      0.0008576833689585328,
      0.000804627516772598,
      0.0009437981829978526,
      0.0008445244724862278,
      0.0008858651714399457,
      0.0009416394121944904,
      0.000940360021777451,
      0.0009768872661516071,
      0.0008728628093376756,
      0.0007831414532847703,
      0.0008674037386663258,
      0.0008732642605900764,
      0.0007815729849971831,
      0.0007110472046770156,
      0.0007770341553259641,
      0.0008615418220870197,
      0.0008388535119593143,
      0.0008622925728559494,
      0.0007975589344277978,
      0.0008711132290773094,
      0.0008196236402727663,
      0.0006822016928344965,
      0.0007451966672670096,
      0.0008133985032327474,
      0.0008279918855987489,
      0.0008699010370764881,
      0.0008870847895741463,
      0.000813605374423787,
      0.0010336625250056387,
      0.0008370636939071119,
      0.0008755738358013332,
      0.0007888004137203097,
      0.0008206329611130059,
      0.0007940501987468451,
      0.0009206899302080274,
      0.0007572578161489219,
      0.0007964219735004008,
      0.0007548941695131362,
      0.000893469569273293,
      0.0006947637186385692,
      0.0007593558286316693,
      0.0007101493165828287,
      0.0007212650624569505,
      0.0009533318807370961,
      0.0008715627167839557,
      0.0007705455541145057,
      0.0007538202067371457,
      0.0007309296040330082,
      0.0007616090064402669,
      0.0008877515536732972,
      0.0007551378384232521,
      0.0008379682688973844,
      0.0006705499824602157,
      0.0006806901074014604,
      0.0007944908738136291,
      0.0007246948673855513,
      0.0007805743161588908,
      0.0008139661303721369,
      0.000842367943841964,
      0.0006908642721828073,
      0.0007303507823962718,
      0.0008357305289246142,
      0.0006989592104218901,
      0.0007245931192301214,
      0.0006879846774972975,
      0.0007121866790112108,
      0.0008315737720113247,
      0.0007798710220959038,
      0.0006513748201541602,
      0.0007332133746240288,
      0.0007172499469015747,
      0.0007183807459659875,
      0.0007001657085493207,
      0.0008145523362327367,
      0.0006726488831918687,
      0.0007049992715474218,
      0.0008587682689540088,
      0.0006984086951706558,
      0.0006581096316222102,
      0.0007028281863313169,
      0.0007797656010370701,
      0.0006391080818139017,
      0.0006735362473409622,
      0.0006479057529941201,
      0.0007485908828675747,
      0.0008158837701193988,
      0.0006272908335085959,
      0.0006757121765986085,
      0.0007173111906740814,
      0.0007172429678030312,
      0.000642282406333834,
      0.000644947849214077,
      0.0007001219259109348,
      0.000748962895013392,
      0.0006899031193461269,
      0.0007100997341331094,
      0.0007164344552438707,
      0.0007548301154747606,
      0.0006589954532682895,
      0.0005715771904215216,
      0.0007664426241535693,
      0.0007815762632526458,
      0.0007288917317055166,
      0.0008561282837763429,
      0.0006128383940085768,
      0.0007782800821587443,
      0.0007082503102719784,
      0.0007385680254083127,
      0.0007560552982613444,
      0.0008141365519259126,
      0.0007952782767824828,
      0.0007399912457913161,
      0.0006809023709502071,
      0.0008413207728881389,
      0.0007258260017260909,
      0.0007173617242369801,
      0.0007652703602798283,
      0.000686428165063262,
      0.0006696018530055881,
      0.0008101894287392497,
      0.0007390963088255376,
      0.0007294041116256267,
      0.0007244143297430128,
      0.0006484236114192754,
      0.0007919943321030587,
      0.0006673588277772069,
      0.0007246846146881581,
      0.0006871265999507159,
      0.0006914604967460037,
      0.0006622891454026103,
      0.0006857414776459337,
      0.000652576657012105,
      0.0006793858611490578,
      0.0006500781828071922,
      0.0007632372120860965,
      0.000716386508429423,
      0.0005893504258710891,
      0.0007680343778338284,
      0.0006934037839528174,
      0.0006820183829404413,
      0.0006454412871971727,
      0.0007008758827578277,
      0.0007474729768000544,
      0.0007764146046247333,
      0.0007852382119745016,
      0.0006774798943661154,
      0.0006418876245152205,
      0.0006648523325566202,
      0.0007783188938628882,
      0.0006429455650504678,
      0.0006733588024508208,
      0.0006588246207684279,
      0.0007273610704578459,
      0.0007525053631979972,
      0.0006288237415719777,
      0.0007572604657616466,
      0.0006812487181741744,
      0.0006746968219522387,
      0.000672934529138729,
      0.0007103982928674668,
      0.0005602588271722197,
      0.000793174853315577,
      0.0006191824015695602,
      0.0006664175167679787,
      0.0007874653080943972,
      0.00074909275630489,
      0.0006578485679347068,
      0.0006459871935658157,
      0.0008219136553816497,
      0.0007208334910683334,
      0.000678468665573746,
      0.0006136548682115972,
      0.0007637592195533216,
      0.0006477482593618333,
      0.0006849883787799627,
      0.0006521391810383648,
      0.0006545360886957497,
      0.0007145037874579429,
      0.0007451607042457908,
      0.0007188231300096959,
      0.0006218754651490599,
      0.000703996712109074,
      0.0007775819406379014,
      0.0006449525745119899,
      0.0006602043623570353,
      0.0006931156013160944,
      0.0007445168029516935,
      0.0006263144640251994,
      0.0006404199812095612,
      0.0006179862492717802,
      0.0006029122194740922,
      0.0007135167252272367,
      0.0008202092035207898,
      0.0007280001451727003,
      0.0006354285136330873,
      0.0006363408360630274,
      0.0006518895481713116,
      0.0006648174265865236,
      0.0006375208648387343,
      0.0006716851249802858,
      0.0006557911296840757,
      0.0006742915860377252,
      0.0006139701569918543,
      0.0006810006604064256,
      0.0007035259867552668,
      0.0006842188269365579,
      0.0007274644321296364,
      0.0007243382057640701,
      0.0006514342222362757,
      0.0006960602710023523,
      0.0006714880187064409,
      0.0007645141333341599,
      0.0006267998926341534,
      0.000549076838651672,
      0.0006075271510053426,
      0.0006537716963794083,
      0.000815089502139017,
      0.0006254381139297039,
      0.0006689858040772378,
      0.0007458416512235999,
      0.0007134967995807528,
      0.0006505503284279257,
      0.0006866357126273215,
      0.0007449472474399954,
      0.0007994995731860399,
      0.0006305062095634639,
      0.0007771572261117399,
      0.0005861255468335003,
      0.0006490461132489145,
      0.0006539240025449544,
      0.0006622201285790652,
      0.0007105018536094576,
      0.0006533806992229074,
      0.0007010934164281934,
      0.0007012153195682914,
      0.000755780409090221,
      0.0008057881472632289,
      0.0006859079585410655,
      0.0007089377380907535,
      0.0006428735004737974,
      0.0005661467951722444,
      0.0007157801324501633,
      0.0007247704407200217,
      0.0005991804727818817,
      0.0007261993235442788,
      0.0007164769223891199,
      0.0006233677617274225,
      0.0006663454335648567,
      0.0006438348535448313,
      0.0007517258368898183,
      0.000670062496792525,
      0.0006267531984485686,
      0.0006254192080814391,
      0.0005945080297533423,
      0.0005711148411501199,
      0.000689375870861113,
      0.0006672603124752641,
      0.000663038503844291,
      0.0007116432231850922,
      0.0006037923658732324,
      0.0006644873775076121,
      0.0006262161186896265,
      0.0006195592042058706,
      0.0006777037784922868,
      0.0006847626541275531,
      0.0006523103034123778,
      0.0007164986117277294,
      0.0006631425581872463,
      0.000611533661140129,
      0.0006662521406542509,
      0.0005739644973073155,
      0.0006848217360675335,
      0.0007152285834308714,
      0.0007286195596680046,
      0.0006802385661285371,
      0.0008059242740273475,
      0.0006603163795080036,
      0.0006776861753314733,
      0.0006911346525885165,
      0.000688123315339908,
      0.0007451189926359803,
      0.0007950245263054967,
      0.0006888545083347708,
      0.0007957801409065724,
      0.0006623369629960508,
      0.0006840018113143742,
      0.0007711689337156713,
      0.0007078714692033827,
      0.0006231818103697151,
      0.000626515190815553,
      0.0007352589105721563,
      0.0006479688861873,
      0.0006479219882749021,
      0.0006612318241968751,
      0.000716126230545342,
      0.0006498502742033451,
      0.0006611649214755743,
      0.0006742634135298431,
      0.000718388466630131,
      0.000629462479846552,
      0.0006513466988690198,
      0.0007121214945800602,
      0.0006793108757119626,
      0.0006632674182765186,
      0.0007429877505637705,
      0.0006127848429605364,
      0.0006982322025578469
    ],
    "val_loss": [
      0.026283978950232267,
      0.010537958587519825,
      0.005959302419796586,
      0.004318590275943279,
      0.0030702243675477803,
      0.002922787476563826,
      0.0025521343923173845,
      0.002321083302376792,
      0.00798261584714055,
      0.0021031112992204726,
      0.0017658683937042952,
      0.0014127750473562628,
      0.0013168116856832057,
      0.0011439024819992483,
      0.001063952426193282,
      0.001072018058039248,
      0.0009205675742123276,
      0.0008688613161211833,
      0.0009406320168636739,
      0.0008466239087283611,
      0.0008537380199413747,
      0.000853416626341641,
      0.000729498322471045,
      0.00078472716268152,
      0.0008798330236459151,
      0.0008069790492299944,
      0.0006943310581846163,
      0.0007332871900871396,
      0.0007314854301512241,
      0.0007045981183182448,
      0.000687224353896454,
      0.0007475645543308929,
      0.0006659597856923938,
      0.0006857705011498183,
      0.0006536507571581751,
      0.000658027915051207,
      0.0006836809479864314,
      0.0006523653282783926,
      0.000709256695699878,
      0.0006997311138547957,
      0.0005934788059676066,
      0.0006055301346350461,
      0.0006064220069674775,
      0.0005584962927969173,
      0.0005540750862564892,
      0.0005318818002706394,
      0.0007774600817356259,
      0.0006748703453922644,
      0.0005385820259107277,
      0.0005968235345790163,
      0.0005532360810320824,
      0.0006339181418297812,
      0.0005367491394281387,
      0.0006781233678339049,
      0.0005130512581672519,
      0.0005704629147658125,
      0.0005057957168901339,
      0.0005435101484181359,
      0.0005702158086933196,
      0.0009046931809280068,
      0.0005238619050942361,
      0.0005645100609399378,
      0.000608793823630549,
      0.0005197777500143275,
      0.0005501889463630505,
      0.0006751372711732984,
      0.0005121481517562643,
      0.0005055325964349322,
      0.0004872201825492084,
      0.0005607069324469194,
      0.000548997413716279,
      0.0005114811647217721,
      0.00047762215399416164,
      0.0005107681645313278,
      0.0004923850938212126,
      0.0004710483699454926,
      0.00046989237307570875,
      0.00046080788888502866,
      0.0005168660427443683,
      0.0004771416643052362,
      0.0004560653178486973,
      0.0005649148661177605,
      0.0005391296654124744,
      0.0004907030379399657,
      0.0005036115035181865,
      0.0004704201128333807,
      0.0005726730159949511,
      0.0005614580732071772,
      0.0005125643219798803,
      0.00044861620699521154,
      0.000577094906475395,
      0.0005418930377345532,
      0.0004974581970600411,
      0.00046336406376212835,
      0.000707709405105561,
      0.0004941070437780581,
      0.0005205552661209367,
      0.0005583390448009595,
      0.0005721991619793698,
      0.0005261179321678355,
      0.000609052469371818,
      0.0004880507694906555,
      0.0005032297194702551,
      0.0004887427785433829,
      0.0005338089686119929,
      0.0005469665047712624,
      0.0004412563575897366,
      0.0004426024461281486,
      0.0004328079739934765,
      0.0004195217043161392,
      0.0004377869190648198,
      0.000455352412245702,
      0.0004913340380880982,
      0.000440710224211216,
      0.00041336733556818217,
      0.00044693600648315623,
      0.00041551675531081855,
      0.00044102113315602764,
      0.00043793374788947403,
      0.0004745983023894951,
      0.00045799418148817495,
      0.000430036336183548,
      0.000501969552715309,
      0.0004814484782400541,
      0.0004802903131349012,
      0.000493861545692198,
      0.00047019821067806333,
      0.0004680276397266425,
      0.00045596105337608606,
      0.0004292323501431383,
      0.0004343724067439325,
      0.00042358218342997134,
      0.00041622288699727505,
      0.0004354975535534322,
      0.0004036181344417855,
      0.0004004666334367357,
      0.0004032665674458258,
      0.00040305215952685103,
      0.0004292845114832744,
      0.0004382201223052107,
      0.00041671812505228445,
      0.00040933004493126646,
      0.00043268698937026784,
      0.00043867815838893875,
      0.0004177701994194649,
      0.0004213192587485537,
      0.0004091787996003404,
      0.0004461346979951486,
      0.00043091231782455,
      0.0004307736744522117,
      0.0004129378357902169,
      0.0004141695753787644,
      0.0004116286654607393,
      0.00042365817353129387,
      0.0004018377876491286,
      0.00040019239531829953,
      0.00041437816253164783,
      0.00041005983075592667,
      0.00041580238030292094,
      0.0004019175103167072,
      0.00040131482819560915,
      0.0003992196870967746,
      0.0004018068575533107,
      0.0004166790604358539,
      0.00040348103357246146,
      0.0003997729072580114,
      0.00040286077273776755,
      0.0003969812532886863,
      0.0004026802707812749,
      0.0004093574025318958,
      0.00039559278229717165,
      0.0004095621552551165,
      0.00040145923412637785,
      0.00039678922621533275,
      0.00039834682684158906,
      0.0003948314770241268,
      0.0004108048233320005,
      0.0003938540176022798,
      0.0003955471911467612,
      0.00040893191908253357,
      0.00042223210039082915,
      0.00040334514051210135,
      0.0004029693373013288,
      0.00039775786717655137,
      0.00039851420297054574,
      0.00040306289156433195,
      0.0004071658622706309,
      0.00039456060039810836,
      0.00041818157478701323,
      0.00039590123924426734,
      0.0003977357118856162,
      0.000406102066335734,
      0.00039093909435905516,
      0.00039042312710080296,
      0.0004146502396906726,
      0.00041177591629093513,
      0.0004717590272775851,
      0.0003988183743786067,
      0.0004788689475390129,
      0.0004140931196161546,
      0.00042259956535417587,
      0.00043693769111996517,
      0.00039867856685305014,
      0.0003983255155617371,
      0.0004161711040069349,
      0.00041662639705464244,
      0.00039529694186057895,
      0.00039029143954394385,
      0.00041320865420857444,
      0.00039974143874133006,
      0.0004102266757399775,
      0.0004143093538004905,
      0.00041286968189524487,
      0.0003981742629548535,
      0.0004056874749949202,
      0.000403731653932482,
      0.00043647543498082086,
      0.00039627250225748867,
      0.0004104302861378528,
      0.0003942073817597702,
      0.00041250048525398597,
      0.00039834687777329236,
      0.00043162464135093614,
      0.00039662876224610955,
      0.00041321328171761706,
      0.000387932697776705,
      0.0003857448245980777,
      0.000388286956876982,
      0.0003862462253891863,
      0.00038777598092565313,
      0.000389896405977197,
      0.0003898645954905078,
      0.00040376747347181663,
      0.0003993080899817869,
      0.00039950905920704827,
      0.000385721490602009,
      0.00039528713386971503,
      0.00040075962169794366,
      0.0003940677852369845,
      0.0004217733658151701,
      0.0003954594430979341,
      0.00038527110154973343,
      0.00040838987479219213,
      0.000391678542655427,
      0.00039305735117523,
      0.0003930293896701187,
      0.0003875085894833319,
      0.00039281595672946423,
      0.0003946621291106567,
      0.0003979563698521815,
      0.00038824624061817303,
      0.0003968530218116939,
      0.00039239700709003955,
      0.0004076328768860549,
      0.0003923411277355626,
      0.0003990658660768531,
      0.00038916256016818807,
      0.00038839830085635185,
      0.00038558003143407404,
      0.0003852557420032099,
      0.0003904681507265195,
      0.000385261038900353,
      0.0003933589468942955,
      0.00040730789623921737,
      0.0003892131644533947,
      0.00041210317431250587,
      0.00039447806193493307,
      0.0003981174377258867,
      0.0003910231971531175,
      0.00038697531999787316,
      0.0003934121778002009,
      0.0003949806050513871,
      0.0003891239102813415,
      0.0003858378258883022,
      0.0003928807273041457,
      0.0003834553572232835,
      0.00039065848250174895,
      0.00038706981285940856,
      0.00040459339652443305,
      0.00038686200423398986,
      0.0003873252135235816,
      0.00038733251130906865,
      0.00038843806396471336,
      0.0003841337747871876,
      0.0003928846199414693,
      0.0003877473282045685,
      0.00039527210174128413,
      0.00039276352617889643,
      0.00040032869583228603,
      0.0003874816247844137,
      0.00039226705848705024,
      0.0003848887499771081,
      0.0003827504988294095,
      0.0003840965437120758,
      0.0003875541078741662,
      0.00038782799674663693,
      0.00039483026921516284,
      0.0003954802450607531,
      0.00039886472950456664,
      0.0003852002919302322,
      0.00038393799331970513,
      0.00039264457154786214,
      0.00038880870124557987,
      0.00038403941289288923,
      0.0003959549139835872,
      0.00038740580930607393,
      0.0004010066913906485,
      0.0003853244779747911,
      0.00038515239430125803,
      0.0003829313864116557,
      0.0003830123823718168,
      0.0004027180475532077,
      0.00038423126534326,
      0.0003852177978842519,
      0.0003836171672446653,
      0.00040225585689768195,
      0.00038657333061564714,
      0.00038358845631591976,
      0.00039181223110063,
      0.00038461833173641935,
      0.00038864243833813816,
      0.00039109987119445577,
      0.0003838118282146752,
      0.00038235141983022913,
      0.00038506279088323936,
      0.00039419539825757965,
      0.00038600945117650554,
      0.00040805581375025213,
      0.0003833629816654138,
      0.0003909634397132322,
      0.0003884202160406858,
      0.00039047216705512255,
      0.00039329990977421403,
      0.00038405536906793714,
      0.00038940437661949545,
      0.0003839334531221539,
      0.0003823609004030004,
      0.0003837793046841398,
      0.00038473025779239833,
      0.0003857807532767765,
      0.00039129767537815496,
      0.00038333000702550635,
      0.0003865882390527986,
      0.0003871661829180084,
      0.0003875026377500035,
      0.0003911648673238233,
      0.00038401834899559617,
      0.00038651598151773214,
      0.00039078201371012256,
      0.0003878789284499362,
      0.00038271558878477663,
      0.00038183890865184367,
      0.00038885732647031546,
      0.0003897893548128195,
      0.00038326034700730816,
      0.00038803684583399445,
      0.000385132480005268,
      0.00038872953155077994,
      0.0003861527729895897,
      0.00039482369902543724,
      0.00038205798773560673,
      0.00038316800782922655,
      0.0003934424094040878,
      0.00038511762249981984,
      0.0003829495399259031,
      0.0003894798501278274,
      0.00039356965135084465,
      0.0003851508954539895,
      0.00038674292591167614,
      0.00038693771784892306,
      0.0003854593524010852,
      0.0003883770477841608,
      0.0003899165676557459,
      0.00038741219032090157,
      0.0003845256142085418,
      0.0003835610332316719,
      0.0004107880231458694,
      0.00039684282819507644,
      0.0004015684098703787,
      0.0003834615199593827,
      0.0003829648485407233,
      0.00038937955832807347,
      0.00039334454049821943,
      0.0003891555606969632,
      0.00039172110700747,
      0.0003932657346013002,
      0.0003845909595838748,
      0.0003928291116608307,
      0.0003906611236743629,
      0.000388686872611288,
      0.00038420075725298375,
      0.0003835150055238046,
      0.00038489006692543626,
|
| 820 |
+
0.00038466444675577804,
|
| 821 |
+
0.0003854309252346866,
|
| 822 |
+
0.00038482085074065253,
|
| 823 |
+
0.00038886912079760805,
|
| 824 |
+
0.0003943712363252416,
|
| 825 |
+
0.0003830662608379498,
|
| 826 |
+
0.00039585917693329975,
|
| 827 |
+
0.0003851762230624445,
|
| 828 |
+
0.0003863131278194487
|
| 829 |
+
],
|
| 830 |
+
"train_data_loss": [
|
| 831 |
+
0.06111632943153381,
|
| 832 |
+
0.022610593140125274,
|
| 833 |
+
0.011463590860366822,
|
| 834 |
+
0.007689229417592287,
|
| 835 |
+
0.005834103915840387,
|
| 836 |
+
0.00570264708250761,
|
| 837 |
+
0.005017013922333717,
|
| 838 |
+
0.004266595495864749,
|
| 839 |
+
0.0040910478122532365,
|
| 840 |
+
0.004168967669829726,
|
| 841 |
+
0.0032320022955536843,
|
| 842 |
+
0.0021958278166130185,
|
| 843 |
+
0.001956210043281317,
|
| 844 |
+
0.0016189745021983981,
|
| 845 |
+
0.0014535277104005219,
|
| 846 |
+
0.001414215660188347,
|
| 847 |
+
0.001328134338837117,
|
| 848 |
+
0.0013814367516897619,
|
| 849 |
+
0.0014048888022080064,
|
| 850 |
+
0.0013899376825429498,
|
| 851 |
+
0.0011791874142363667,
|
| 852 |
+
0.001211162854451686,
|
| 853 |
+
0.0010694950213655829,
|
| 854 |
+
0.0010189943108707667,
|
| 855 |
+
0.0010792856547050178,
|
| 856 |
+
0.0011345495865680278,
|
| 857 |
+
0.0009482379734981805,
|
| 858 |
+
0.0011318472865968943,
|
| 859 |
+
0.0011008738540112973,
|
| 860 |
+
0.0011046079988591374,
|
| 861 |
+
0.0010339000588282944,
|
| 862 |
+
0.00091316765290685,
|
| 863 |
+
0.0010238043614663184,
|
| 864 |
+
0.0009843404218554497,
|
| 865 |
+
0.0009266045759432018,
|
| 866 |
+
0.0008129035646561533,
|
| 867 |
+
0.0010064059833530337,
|
| 868 |
+
0.0009916366392280905,
|
| 869 |
+
0.0009183288726489991,
|
| 870 |
+
0.0010460032569244505,
|
| 871 |
+
0.0008236625941935926,
|
| 872 |
+
0.0007431767729576677,
|
| 873 |
+
0.0009142599301412702,
|
| 874 |
+
0.0008733041281811893,
|
| 875 |
+
0.0008099981455598026,
|
| 876 |
+
0.0007963566412217915,
|
| 877 |
+
0.000851825758581981,
|
| 878 |
+
0.0008543513529002667,
|
| 879 |
+
0.0007053828809875994,
|
| 880 |
+
0.0007426441961433739,
|
| 881 |
+
0.0009265714627690613,
|
| 882 |
+
0.0007511991716455668,
|
| 883 |
+
0.0008437470160424709,
|
| 884 |
+
0.000600542580941692,
|
| 885 |
+
0.0007313118712045252,
|
| 886 |
+
0.000605277749709785,
|
| 887 |
+
0.0008169321122113615,
|
| 888 |
+
0.0007159240893088281,
|
| 889 |
+
0.0008277881855610758,
|
| 890 |
+
0.0007249951653648168,
|
| 891 |
+
0.0008671594539191574,
|
| 892 |
+
0.0007262648869073018,
|
| 893 |
+
0.0007478684931993485,
|
| 894 |
+
0.0007032422884367406,
|
| 895 |
+
0.0006882490398129449,
|
| 896 |
+
0.000717442708555609,
|
| 897 |
+
0.000759283791994676,
|
| 898 |
+
0.0007924494112376124,
|
| 899 |
+
0.0005203706800239161,
|
| 900 |
+
0.0006142496858956292,
|
| 901 |
+
0.0006249511800706386,
|
| 902 |
+
0.0006611913640517742,
|
| 903 |
+
0.0006428097171010449,
|
| 904 |
+
0.0006750389892840758,
|
| 905 |
+
0.0006663104542531073,
|
| 906 |
+
0.000596378244808875,
|
| 907 |
+
0.0006057329382747412,
|
| 908 |
+
0.000554600958712399,
|
| 909 |
+
0.0005564349913038313,
|
| 910 |
+
0.0005631430906942115,
|
| 911 |
+
0.0005051506479503587,
|
| 912 |
+
0.0006565963639877737,
|
| 913 |
+
0.0006301928736502305,
|
| 914 |
+
0.0007102849654620513,
|
| 915 |
+
0.0005690045322990045,
|
| 916 |
+
0.000631097424775362,
|
| 917 |
+
0.0006392812548438087,
|
| 918 |
+
0.0005283830058760941,
|
| 919 |
+
0.0005664813949260861,
|
| 920 |
+
0.0005404965661000461,
|
| 921 |
+
0.0007329248893074692,
|
| 922 |
+
0.0006166069553000853,
|
| 923 |
+
0.0005655786395072936,
|
| 924 |
+
0.0004584922845242545,
|
| 925 |
+
0.0007378292316570878,
|
| 926 |
+
0.0005565923231188208,
|
| 927 |
+
0.0005042030126787723,
|
| 928 |
+
0.0006413497671019286,
|
| 929 |
+
0.0005445062555372715,
|
| 930 |
+
0.0005829047330189497,
|
| 931 |
+
0.000639097816310823,
|
| 932 |
+
0.0006410515931202099,
|
| 933 |
+
0.0006729495921172202,
|
| 934 |
+
0.0005729772936319932,
|
| 935 |
+
0.00047854945703875275,
|
| 936 |
+
0.0005694313073763623,
|
| 937 |
+
0.0005733832920668646,
|
| 938 |
+
0.00048158225079532714,
|
| 939 |
+
0.000408885040669702,
|
| 940 |
+
0.0004747627273900434,
|
| 941 |
+
0.0005629531759768724,
|
| 942 |
+
0.000537846113438718,
|
| 943 |
+
0.0005594389163888991,
|
| 944 |
+
0.000497185640851967,
|
| 945 |
+
0.000571522144600749,
|
| 946 |
+
0.0005186308943666518,
|
| 947 |
+
0.0003814709500875324,
|
| 948 |
+
0.00044413551688194275,
|
| 949 |
+
0.0005130177806131541,
|
| 950 |
+
0.0005261707497993484,
|
| 951 |
+
0.0005703791824635118,
|
| 952 |
+
0.0005887022870592773,
|
| 953 |
+
0.0005153859499841928,
|
| 954 |
+
0.0007296169176697731,
|
| 955 |
+
0.0005363404453964904,
|
| 956 |
+
0.0005727375944843515,
|
| 957 |
+
0.0004898820287780836,
|
| 958 |
+
0.000519198140827939,
|
| 959 |
+
0.0004940752277616411,
|
| 960 |
+
0.0006206767400726676,
|
| 961 |
+
0.00045496375707443804,
|
| 962 |
+
0.000493877975968644,
|
| 963 |
+
0.0004537338484078646,
|
| 964 |
+
0.0005948455136967823,
|
| 965 |
+
0.00039338122704066334,
|
| 966 |
+
0.0004589460021816194,
|
| 967 |
+
0.0004080594750121236,
|
| 968 |
+
0.00042077398858964444,
|
| 969 |
+
0.0006537522043799981,
|
| 970 |
+
0.0005689837958198041,
|
| 971 |
+
0.00047009934263769537,
|
| 972 |
+
0.0004535078490152955,
|
| 973 |
+
0.0004300638078711927,
|
| 974 |
+
0.0004624957707710564,
|
| 975 |
+
0.0005870564776705578,
|
| 976 |
+
0.00045413391082547603,
|
| 977 |
+
0.0005388701322954149,
|
| 978 |
+
0.00037020818155724555,
|
| 979 |
+
0.0003798494557850063,
|
| 980 |
+
0.0004939011321403086,
|
| 981 |
+
0.0004258868045872077,
|
| 982 |
+
0.00047798498591873796,
|
| 983 |
+
0.0005132798774866387,
|
| 984 |
+
0.0005418015486793593,
|
| 985 |
+
0.000391564748424571,
|
| 986 |
+
0.0004303595452802256,
|
| 987 |
+
0.0005348569445777684,
|
| 988 |
+
0.00039989388780668376,
|
| 989 |
+
0.0004243733605835587,
|
| 990 |
+
0.0003882594263995998,
|
| 991 |
+
0.00041267815977334975,
|
| 992 |
+
0.0005310245009604842,
|
| 993 |
+
0.00048007640929427,
|
| 994 |
+
0.0003512048663105816,
|
| 995 |
+
0.00043406582553870974,
|
| 996 |
+
0.0004168531703180633,
|
| 997 |
+
0.0004196171060902998,
|
| 998 |
+
0.00039875314105302094,
|
| 999 |
+
0.0005135321139823645,
|
| 1000 |
+
0.0003728672882425599,
|
| 1001 |
+
0.00040361354243941607,
|
| 1002 |
+
0.0005594447546172887,
|
| 1003 |
+
0.0003976003377465531,
|
| 1004 |
+
0.00035834084497764706,
|
| 1005 |
+
0.000403238728758879,
|
| 1006 |
+
0.0004792337608523667,
|
| 1007 |
+
0.0003386170108569786,
|
| 1008 |
+
0.0003735307126771659,
|
| 1009 |
+
0.00034900786413345487,
|
| 1010 |
+
0.0004467714158818126,
|
| 1011 |
+
0.0005163575173355639,
|
| 1012 |
+
0.0003273250907659531,
|
| 1013 |
+
0.00037604146637022496,
|
| 1014 |
+
0.0004178599431179464,
|
| 1015 |
+
0.0004167048999806866,
|
| 1016 |
+
0.0003420985653065145,
|
| 1017 |
+
0.0003435955318855122,
|
| 1018 |
+
0.00040012017474509774,
|
| 1019 |
+
0.0004499701008899137,
|
| 1020 |
+
0.00038878290099091827,
|
| 1021 |
+
0.00040934043849119914,
|
| 1022 |
+
0.00041699027235154063,
|
| 1023 |
+
0.00045416245877277105,
|
| 1024 |
+
0.00035941804060712456,
|
| 1025 |
+
0.00027033825783291833,
|
| 1026 |
+
0.00046619017550256106,
|
| 1027 |
+
0.0004832038906170055,
|
| 1028 |
+
0.0004278036579489708,
|
| 1029 |
+
0.0005549707694444806,
|
| 1030 |
+
0.0003126340248854831,
|
| 1031 |
+
0.00047633283014874905,
|
| 1032 |
+
0.0004091521684313193,
|
| 1033 |
+
0.0004381974315037951,
|
| 1034 |
+
0.00045444103103363886,
|
| 1035 |
+
0.0005142191686900333,
|
| 1036 |
+
0.0004966847517061979,
|
| 1037 |
+
0.00044013060338329526,
|
| 1038 |
+
0.00038002003449946645,
|
| 1039 |
+
0.0005402848724042997,
|
| 1040 |
+
0.00042452524474356323,
|
| 1041 |
+
0.00041636235895566644,
|
| 1042 |
+
0.0004658939124783501,
|
| 1043 |
+
0.00038565375522011893,
|
| 1044 |
+
0.0003701596526661888,
|
| 1045 |
+
0.0005098755250219256,
|
| 1046 |
+
0.000437792107113637,
|
| 1047 |
+
0.00043023425852879883,
|
| 1048 |
+
0.0004240698862122372,
|
| 1049 |
+
0.0003492091747466475,
|
| 1050 |
+
0.0004922032251488417,
|
| 1051 |
+
0.0003676940343575552,
|
| 1052 |
+
0.0004233681387268007,
|
| 1053 |
+
0.0003865395623142831,
|
| 1054 |
+
0.0003919587598647922,
|
| 1055 |
+
0.0003609595715533942,
|
| 1056 |
+
0.0003855390101671219,
|
| 1057 |
+
0.00035332448809640484,
|
| 1058 |
+
0.00037866915401536973,
|
| 1059 |
+
0.0003506757575087249,
|
| 1060 |
+
0.00046321668138261886,
|
| 1061 |
+
0.00041665853874292227,
|
| 1062 |
+
0.0002891445576096885,
|
| 1063 |
+
0.0004698287887731567,
|
| 1064 |
+
0.0003920018800999969,
|
| 1065 |
+
0.0003838195838034153,
|
| 1066 |
+
0.00034527558484114706,
|
| 1067 |
+
0.0004004773497581482,
|
| 1068 |
+
0.0004460592911345884,
|
| 1069 |
+
0.0004774400725727901,
|
| 1070 |
+
0.00048316487635020167,
|
| 1071 |
+
0.00037865151360165326,
|
| 1072 |
+
0.00034100685734301807,
|
| 1073 |
+
0.0003636537567945197,
|
| 1074 |
+
0.0004790426656836644,
|
| 1075 |
+
0.0003422349423635751,
|
| 1076 |
+
0.00037237264768918976,
|
| 1077 |
+
0.00035927878401707857,
|
| 1078 |
+
0.00042495642905123534,
|
| 1079 |
+
0.0004526011215057224,
|
| 1080 |
+
0.00032811520737595855,
|
| 1081 |
+
0.0004564619541633874,
|
| 1082 |
+
0.0003821966660325415,
|
| 1083 |
+
0.00037347961391787977,
|
| 1084 |
+
0.00037283525365637614,
|
| 1085 |
+
0.0004099928488722071,
|
| 1086 |
+
0.0002609397206106223,
|
| 1087 |
+
0.0004936043510679155,
|
| 1088 |
+
0.0003176469326717779,
|
| 1089 |
+
0.00036575875768903645,
|
| 1090 |
+
0.00048773832910228523,
|
| 1091 |
+
0.0004496339207980782,
|
| 1092 |
+
0.0003576860192697495,
|
| 1093 |
+
0.0003444012656109408,
|
| 1094 |
+
0.0005206863721832633,
|
| 1095 |
+
0.00042083204665686934,
|
| 1096 |
+
0.00037871083070058377,
|
| 1097 |
+
0.00031317147688241675,
|
| 1098 |
+
0.0004630110843572766,
|
| 1099 |
+
0.00034779761568643153,
|
| 1100 |
+
0.0003861369303194806,
|
| 1101 |
+
0.00035131232056301087,
|
| 1102 |
+
0.0003526116325519979,
|
| 1103 |
+
0.00041454613266978413,
|
| 1104 |
+
0.0004466020679683425,
|
| 1105 |
+
0.00041750032105483117,
|
| 1106 |
+
0.0003226558881578967,
|
| 1107 |
+
0.00040290944511070845,
|
| 1108 |
+
0.0004762007808312774,
|
| 1109 |
+
0.0003461087477626279,
|
| 1110 |
+
0.0003599523002048954,
|
| 1111 |
+
0.000392874157987535,
|
| 1112 |
+
0.00044395229837391526,
|
| 1113 |
+
0.00032559183047851546,
|
| 1114 |
+
0.0003410894508124329,
|
| 1115 |
+
0.0003167266916716471,
|
| 1116 |
+
0.00030249347852077333,
|
| 1117 |
+
0.0004144866915885359,
|
| 1118 |
+
0.0005205274271429517,
|
| 1119 |
+
0.0004272672557272017,
|
| 1120 |
+
0.0003356709680519998,
|
| 1121 |
+
0.0003353700891602784,
|
| 1122 |
+
0.0003525414894102141,
|
| 1123 |
+
0.00036397849296918136,
|
| 1124 |
+
0.0003373487788485363,
|
| 1125 |
+
0.00037210039154160767,
|
| 1126 |
+
0.0003548764149309136,
|
| 1127 |
+
0.00037402055982965976,
|
| 1128 |
+
0.0003128998368629254,
|
| 1129 |
+
0.00038001712586265056,
|
| 1130 |
+
0.00040457836439600214,
|
| 1131 |
+
0.0003853932477068156,
|
| 1132 |
+
0.00042645255802199245,
|
| 1133 |
+
0.0004238636000081897,
|
| 1134 |
+
0.0003507738478947431,
|
| 1135 |
+
0.0003962021629558876,
|
| 1136 |
+
0.00037149047187995165,
|
| 1137 |
+
0.00046468668995657935,
|
| 1138 |
+
0.00032575733319390566,
|
| 1139 |
+
0.00024963457719422876,
|
| 1140 |
+
0.0003067754174116999,
|
| 1141 |
+
0.00035329042904777455,
|
| 1142 |
+
0.0005153995635919273,
|
| 1143 |
+
0.0003246889301226474,
|
| 1144 |
+
0.0003684318211162463,
|
| 1145 |
+
0.0004444846746628173,
|
| 1146 |
+
0.00041379352449439467,
|
| 1147 |
+
0.0003515669924672693,
|
| 1148 |
+
0.00038759303221013396,
|
| 1149 |
+
0.0004435857862699777,
|
| 1150 |
+
0.0004995982383843512,
|
| 1151 |
+
0.0003287370689213276,
|
| 1152 |
+
0.00047642024292144925,
|
| 1153 |
+
0.00028691707353573293,
|
| 1154 |
+
0.0003483933850657195,
|
| 1155 |
+
0.0003521939943311736,
|
| 1156 |
+
0.00036151885491563006,
|
| 1157 |
+
0.0004122573434142396,
|
| 1158 |
+
0.00035326316545251755,
|
| 1159 |
+
0.0004000843345420435,
|
| 1160 |
+
0.000400399878853932,
|
| 1161 |
+
0.00045455951592884955,
|
| 1162 |
+
0.00050500372890383,
|
| 1163 |
+
0.0003862933220807463,
|
| 1164 |
+
0.00040961132501252,
|
| 1165 |
+
0.0003414417378371581,
|
| 1166 |
+
0.0002654360927408561,
|
| 1167 |
+
0.0004154561273753643,
|
| 1168 |
+
0.0004236356131150387,
|
| 1169 |
+
0.00029919785971287637,
|
| 1170 |
+
0.0004272714161197655,
|
| 1171 |
+
0.0004150178382406011,
|
| 1172 |
+
0.0003229470926453359,
|
| 1173 |
+
0.0003658977238228545,
|
| 1174 |
+
0.00034324454085435716,
|
| 1175 |
+
0.0004518656528671272,
|
| 1176 |
+
0.0003687477015773766,
|
| 1177 |
+
0.0003261093501350842,
|
| 1178 |
+
0.00032564327761065216,
|
| 1179 |
+
0.0002933415799634531,
|
| 1180 |
+
0.000270336834655609,
|
| 1181 |
+
0.000387968776631169,
|
| 1182 |
+
0.0003666125950985588,
|
| 1183 |
+
0.0003636504599126056,
|
| 1184 |
+
0.0004127571132266894,
|
| 1185 |
+
0.00030336800264194606,
|
| 1186 |
+
0.0003643505886429921,
|
| 1187 |
+
0.0003267512770253234,
|
| 1188 |
+
0.0003197753941640258,
|
| 1189 |
+
0.00037805401429068296,
|
| 1190 |
+
0.00038405115046771245,
|
| 1191 |
+
0.00035283432574942706,
|
| 1192 |
+
0.00041506045556161553,
|
| 1193 |
+
0.00036260715685784814,
|
| 1194 |
+
0.00030991145758889614,
|
| 1195 |
+
0.0003659922437509522,
|
| 1196 |
+
0.00027322167210513726,
|
| 1197 |
+
0.00038372074777726084,
|
| 1198 |
+
0.0004148900162545033,
|
| 1199 |
+
0.00042836347536649553,
|
| 1200 |
+
0.00037877143884543327,
|
| 1201 |
+
0.0005061252077575773,
|
| 1202 |
+
0.00035948755539720877,
|
| 1203 |
+
0.0003771417384268716,
|
| 1204 |
+
0.00039030959655065087,
|
| 1205 |
+
0.00038789588259533045,
|
| 1206 |
+
0.00044667536625638606,
|
| 1207 |
+
0.0004940482729580253,
|
| 1208 |
+
0.0003876484592910856,
|
| 1209 |
+
0.000496784629940521,
|
| 1210 |
+
0.0003630626905942336,
|
| 1211 |
+
0.0003826658398611471,
|
| 1212 |
+
0.0004710066289408132,
|
| 1213 |
+
0.00040921977342804893,
|
| 1214 |
+
0.0003235640816274099,
|
| 1215 |
+
0.0003243372563156299,
|
| 1216 |
+
0.00043588502041529863,
|
| 1217 |
+
0.0003470188335631974,
|
| 1218 |
+
0.00034793666331097486,
|
| 1219 |
+
0.0003599141997983679,
|
| 1220 |
+
0.0004154544865014032,
|
| 1221 |
+
0.0003486787172732875,
|
| 1222 |
+
0.0003607450271374546,
|
| 1223 |
+
0.00037457575352163987,
|
| 1224 |
+
0.00041867122112307695,
|
| 1225 |
+
0.000329351706604939,
|
| 1226 |
+
0.00035122557223076,
|
| 1227 |
+
0.00041060305142309516,
|
| 1228 |
+
0.0003799958195304498,
|
| 1229 |
+
0.00036206531309289857,
|
| 1230 |
+
0.00044233362597879023,
|
| 1231 |
+
0.0003119029168738052,
|
| 1232 |
+
0.00039872028079116715
|
| 1233 |
+
],
|
| 1234 |
+
"train_physics_loss": [
|
| 1235 |
+
0.017416297048330306,
|
| 1236 |
+
0.007131594605743885,
|
| 1237 |
+
0.005382413696497679,
|
| 1238 |
+
0.004962516687810421,
|
| 1239 |
+
0.004615665636956692,
|
| 1240 |
+
0.00472789304330945,
|
| 1241 |
+
0.004539205282926559,
|
| 1242 |
+
0.004223079662770033,
|
| 1243 |
+
0.004256722638383508,
|
| 1244 |
+
0.004328010128811002,
|
| 1245 |
+
0.0038817600812762975,
|
| 1246 |
+
0.0034686099272221325,
|
| 1247 |
+
0.003320377906784415,
|
| 1248 |
+
0.003289809273555875,
|
| 1249 |
+
0.0032469066232442855,
|
| 1250 |
+
0.0032116375397890804,
|
| 1251 |
+
0.0031851226557046176,
|
| 1252 |
+
0.0031901280395686626,
|
| 1253 |
+
0.003183133350685239,
|
| 1254 |
+
0.0031820846255868674,
|
| 1255 |
+
0.003165650758892298,
|
| 1256 |
+
0.0031518572568893432,
|
| 1257 |
+
0.0031282765232026576,
|
| 1258 |
+
0.0031360022444278,
|
| 1259 |
+
0.0031284498888999226,
|
| 1260 |
+
0.0031564371660351754,
|
| 1261 |
+
0.003112083580344915,
|
| 1262 |
+
0.0031356660183519125,
|
| 1263 |
+
0.0031273475289344786,
|
| 1264 |
+
0.003098822357133031,
|
| 1265 |
+
0.0031254397332668303,
|
| 1266 |
+
0.0031081771105527876,
|
| 1267 |
+
0.0030882700346410276,
|
| 1268 |
+
0.0031194701325148342,
|
| 1269 |
+
0.0031075748056173325,
|
| 1270 |
+
0.003087041238322854,
|
| 1271 |
+
0.003083778377622366,
|
| 1272 |
+
0.0030826180800795555,
|
| 1273 |
+
0.003090157005935907,
|
| 1274 |
+
0.0030648533534258604,
|
| 1275 |
+
0.0030658565741032364,
|
| 1276 |
+
0.0030823756288737058,
|
| 1277 |
+
0.00306918784044683,
|
| 1278 |
+
0.003061797972768545,
|
| 1279 |
+
0.0030520260520279408,
|
| 1280 |
+
0.0030583441257476807,
|
| 1281 |
+
0.003126722862944007,
|
| 1282 |
+
0.0030534956604242325,
|
| 1283 |
+
0.0030110508762300012,
|
| 1284 |
+
0.0030587144382297994,
|
| 1285 |
+
0.00305574256926775,
|
| 1286 |
+
0.0030395802296698093,
|
| 1287 |
+
0.0030413248296827077,
|
| 1288 |
+
0.003057920876890421,
|
| 1289 |
+
0.0030411494337022303,
|
| 1290 |
+
0.0030521050561219455,
|
| 1291 |
+
0.0030196056235581637,
|
| 1292 |
+
0.003084487235173583,
|
| 1293 |
+
0.003022526241838932,
|
| 1294 |
+
0.0030564092472195625,
|
| 1295 |
+
0.0030260732863098385,
|
| 1296 |
+
0.003036827277392149,
|
| 1297 |
+
0.003046699520200491,
|
| 1298 |
+
0.003032410992309451,
|
| 1299 |
+
0.0030305781494826076,
|
| 1300 |
+
0.003016272634267807,
|
| 1301 |
+
0.003024119082838297,
|
| 1302 |
+
0.003025384871289134,
|
| 1303 |
+
0.0030228069424629214,
|
| 1304 |
+
0.003031172901391983,
|
| 1305 |
+
0.003029827391728759,
|
| 1306 |
+
0.003035636655986309,
|
| 1307 |
+
0.0030202470440417527,
|
| 1308 |
+
0.0030380630400031806,
|
| 1309 |
+
0.003007029639557004,
|
| 1310 |
+
0.0030190805066376922,
|
| 1311 |
+
0.0030335069634020327,
|
| 1312 |
+
0.003026323560625315,
|
| 1313 |
+
0.003000856013968587,
|
| 1314 |
+
0.0030281548108905556,
|
| 1315 |
+
0.0030085602961480618,
|
| 1316 |
+
0.003007415719330311,
|
| 1317 |
+
0.0030112454667687418,
|
| 1318 |
+
0.0030219836719334123,
|
| 1319 |
+
0.0029884873982518913,
|
| 1320 |
+
0.0030353478342294692,
|
| 1321 |
+
0.0030422700103372334,
|
| 1322 |
+
0.0030089453887194393,
|
| 1323 |
+
0.002996978936716914,
|
| 1324 |
+
0.0030162672512233256,
|
| 1325 |
+
0.003013855963945389,
|
| 1326 |
+
0.0030150776356458664,
|
| 1327 |
+
0.0030336463265120983,
|
| 1328 |
+
0.003015637919306755,
|
| 1329 |
+
0.003029275005683303,
|
| 1330 |
+
0.003010910488665104,
|
| 1331 |
+
0.0030042449943721294,
|
| 1332 |
+
0.00302448402158916,
|
| 1333 |
+
0.0030001820344477893,
|
| 1334 |
+
0.003029604321345687,
|
| 1335 |
+
0.0030254157911986115,
|
| 1336 |
+
0.002993084192276001,
|
| 1337 |
+
0.003039376614615321,
|
| 1338 |
+
0.0029988550674170254,
|
| 1339 |
+
0.003045919882133603,
|
| 1340 |
+
0.002979724295437336,
|
| 1341 |
+
0.0029988096561282874,
|
| 1342 |
+
0.0029999073036015033,
|
| 1343 |
+
0.003021621545776725,
|
| 1344 |
+
0.003022714126855135,
|
| 1345 |
+
0.0029858863167464733,
|
| 1346 |
+
0.0030100739188492296,
|
| 1347 |
+
0.0030285364482551813,
|
| 1348 |
+
0.0030037328973412515,
|
| 1349 |
+
0.002995910719037056,
|
| 1350 |
+
0.003009927375242114,
|
| 1351 |
+
0.0030073073692619802,
|
| 1352 |
+
0.003010611403733492,
|
| 1353 |
+
0.0030038072913885117,
|
| 1354 |
+
0.0030182112101465462,
|
| 1355 |
+
0.002995218504220247,
|
| 1356 |
+
0.002983824983239174,
|
| 1357 |
+
0.002982194172218442,
|
| 1358 |
+
0.0030404560081660746,
|
| 1359 |
+
0.0030072324350476267,
|
| 1360 |
+
0.0030283622723072767,
|
| 1361 |
+
0.0029891838133335115,
|
| 1362 |
+
0.0030143480468541382,
|
| 1363 |
+
0.002999749705195427,
|
| 1364 |
+
0.003000131780281663,
|
| 1365 |
+
0.003022940456867218,
|
| 1366 |
+
0.003025440014898777,
|
| 1367 |
+
0.0030116031784564257,
|
| 1368 |
+
0.002986240452155471,
|
| 1369 |
+
0.003013824885711074,
|
| 1370 |
+
0.003004098106175661,
|
| 1371 |
+
0.00302089836448431,
|
| 1372 |
+
0.0030049107037484644,
|
| 1373 |
+
0.002995796613395214,
|
| 1374 |
+
0.0030257892142981292,
|
| 1375 |
+
0.0030044620856642724,
|
| 1376 |
+
0.0030031234864145518,
|
| 1377 |
+
0.00300865788012743,
|
| 1378 |
+
0.0029911322519183157,
|
| 1379 |
+
0.003006950691342354,
|
| 1380 |
+
0.0030100391805171966,
|
| 1381 |
+
0.002990981163457036,
|
| 1382 |
+
0.003003418045118451,
|
| 1383 |
+
0.003008406478911638,
|
| 1384 |
+
0.0030058973468840124,
|
| 1385 |
+
0.002988080643117428,
|
| 1386 |
+
0.0030258931685239075,
|
| 1387 |
+
0.0030068624671548607,
|
| 1388 |
+
0.0030056639108806847,
|
| 1389 |
+
0.002992995185777545,
|
| 1390 |
+
0.0029999123141169546,
|
| 1391 |
+
0.0030087356735020877,
|
| 1392 |
+
0.002990653198212385,
|
| 1393 |
+
0.0030021975189447403,
|
| 1394 |
+
0.0029972524475306273,
|
| 1395 |
+
0.002995085110887885,
|
| 1396 |
+
0.0030054925847798585,
|
| 1397 |
+
0.002997946050018072,
|
| 1398 |
+
0.003001699475571513,
|
| 1399 |
+
0.002991475472226739,
|
| 1400 |
+
0.0030039676278829576,
|
| 1401 |
+
0.002987636383622885,
|
| 1402 |
+
0.0030141256283968687,
|
| 1403 |
+
0.0030102021712809803,
|
| 1404 |
+
0.0029978158324956896,
|
| 1405 |
+
0.0030138571839779613,
|
| 1406 |
+
0.0029932350851595403,
|
| 1407 |
+
0.00300808347761631,
|
| 1408 |
+
0.0029976877849549055,
|
| 1409 |
+
0.0029958945233374834,
|
| 1410 |
+
0.0030053183156996967,
|
| 1411 |
+
0.0030049106013029813,
|
| 1412 |
+
0.0030000553000718356,
|
| 1413 |
+
0.002988978885114193,
|
| 1414 |
+
0.003018194520846009,
|
| 1415 |
+
0.0029952624347060917,
|
| 1416 |
+
0.002999657429754734,
|
| 1417 |
+
0.002996707018464804,
|
| 1418 |
+
0.002994512477889657,
|
| 1419 |
+
0.0030053806211799384,
|
| 1420 |
+
0.0030018383730202915,
|
| 1421 |
+
0.003013523109257221,
|
| 1422 |
+
0.0030000174045562746,
|
| 1423 |
+
0.002989927912130952,
|
| 1424 |
+
0.003011202123016119,
|
| 1425 |
+
0.0030075929407030345,
|
| 1426 |
+
0.002994441781193018,
|
| 1427 |
+
0.0030066764354705813,
|
| 1428 |
+
0.0029957740847021343,
|
| 1429 |
+
0.0030123892612755297,
|
| 1430 |
+
0.0030025244038552044,
|
| 1431 |
+
0.0029837236553430556,
|
| 1432 |
+
0.003010880583897233,
|
| 1433 |
+
0.0030115751177072526,
|
| 1434 |
+
0.003002043645828962,
|
| 1435 |
+
0.0030194725282490253,
|
| 1436 |
+
0.0029909814428538085,
|
| 1437 |
+
0.0030037058517336846,
|
| 1438 |
+
0.0030161426309496166,
|
| 1439 |
+
0.002999173793941736,
|
| 1440 |
+
0.002985935164615512,
|
| 1441 |
+
0.0029986063670367,
|
| 1442 |
+
0.003008823273703456,
|
| 1443 |
+
0.0030103589221835135,
|
| 1444 |
+
0.003013007501140237,
|
| 1445 |
+
0.003009993601590395,
|
| 1446 |
+
0.002993764383718371,
|
| 1447 |
+
0.00300774397328496,
|
| 1448 |
+
0.0029944219440221787,
|
| 1449 |
+
0.0030031389463692905,
|
| 1450 |
+
0.0030130419693887235,
|
| 1451 |
+
0.002991698393598199,
|
| 1452 |
+
0.003003444392234087,
|
| 1453 |
+
0.002992144227027893,
|
| 1454 |
+
0.0029979108832776545,
|
| 1455 |
+
0.002996647944673896,
|
| 1456 |
+
0.0030131647642701862,
|
| 1457 |
+
0.0030058703292161226,
|
| 1458 |
+
0.002995017264038324,
|
| 1459 |
+
0.0030132956709712742,
|
| 1460 |
+
0.0030020245909690856,
|
| 1461 |
+
0.002992521747946739,
|
| 1462 |
+
0.0030071669910103085,
|
| 1463 |
+
0.002994024157524109,
|
| 1464 |
+
0.0030002053175121547,
|
| 1465 |
+
0.002997279688715935,
|
| 1466 |
+
0.003002058556303382,
|
| 1467 |
+
0.0029820558801293374,
|
| 1468 |
+
0.0030140189733356237,
|
| 1469 |
+
0.0029819879308342935,
|
| 1470 |
+
0.003001656923443079,
|
| 1471 |
+
0.0030039852764457463,
|
| 1472 |
+
0.0030141368228942156,
|
| 1473 |
+
0.0029897452518343926,
|
| 1474 |
+
0.0030207333341240885,
|
| 1475 |
+
0.0029882837645709513,
|
| 1476 |
+
0.0030088076274842022,
|
| 1477 |
+
0.003011985719203949,
|
| 1478 |
+
0.0029927622620016334,
|
| 1479 |
+
0.0030071061942726373,
|
| 1480 |
+
0.0030098615400493147,
|
| 1481 |
+
0.0029954583570361137,
|
| 1482 |
+
0.003024046355858445,
|
| 1483 |
+
0.0029990424122661352,
|
| 1484 |
+
0.0030070852767676115,
|
| 1485 |
+
0.0030079849250614645,
|
| 1486 |
+
0.002990520391613245,
|
| 1487 |
+
0.003012172058224678,
|
| 1488 |
+
0.00300099266692996,
|
| 1489 |
+
0.003004054306074977,
|
| 1490 |
+
0.0029931909963488578,
|
| 1491 |
+
0.0029957049898803236,
|
| 1492 |
+
0.003015354694798589,
|
| 1493 |
+
0.003006587466225028,
|
| 1494 |
+
0.00299726964905858,
|
| 1495 |
+
0.0029945882875472307,
|
| 1496 |
+
0.003001625528559089,
|
| 1497 |
+
0.003015859238803387,
|
| 1498 |
+
0.0030122727435082197,
|
| 1499 |
+
0.003000014405697584,
|
| 1500 |
+
0.0029975782707333566,
|
| 1501 |
+
0.003004833785817027,
|
| 1502 |
+
0.0030074812564998867,
|
| 1503 |
+
0.0029995064157992603,
|
| 1504 |
+
0.002988514434546232,
|
| 1505 |
+
0.0030082684941589834,
|
| 1506 |
+
0.003019244512543082,
|
| 1507 |
+
0.0029995765071362257,
|
| 1508 |
+
0.0029855862725526095,
|
| 1509 |
+
0.003013227991759777,
|
| 1510 |
+
0.0029921956453472375,
|
| 1511 |
+
0.00301087262108922,
|
| 1512 |
+
0.003013811595737934,
|
| 1513 |
+
0.0029884382616728543,
|
| 1514 |
+
0.003002520548179746,
|
| 1515 |
+
0.003002414321526885,
|
| 1516 |
+
0.003005644930526614,
|
| 1517 |
+
0.0030072263814508917,
|
| 1518 |
+
0.002993305195122957,
|
| 1519 |
+
0.0030125955305993555,
|
| 1520 |
+
0.0030041873548179864,
|
| 1521 |
+
0.0029903002455830574,
|
| 1522 |
+
0.0029968177061527965,
|
| 1523 |
+
0.003007328873500228,
|
| 1524 |
+
0.0029975753556936977,
|
| 1525 |
+
0.003009707406163216,
|
| 1526 |
+
0.0029934804793447256,
|
| 1527 |
+
0.0030083892960101367,
|
| 1528 |
+
0.0030017207656055688,
|
| 1529 |
+
0.002995847277343273,
|
| 1530 |
+
0.0030091470666229725,
|
| 1531 |
+
0.003002710286527872,
|
| 1532 |
+
0.0030107030645012854,
|
| 1533 |
+
0.003009835230186582,
|
| 1534 |
+
0.0029894761461764572,
|
| 1535 |
+
0.0029882557597011327,
|
| 1536 |
+
0.0030101187154650687,
|
| 1537 |
+
0.0030047460459172726,
|
| 1538 |
+
0.003006603643298149,
|
| 1539 |
+
0.0029985809698700903,
|
| 1540 |
+
0.0029999753553420303,
|
| 1541 |
+
0.002998274341225624,
|
| 1542 |
+
0.003010425502434373,
|
| 1543 |
+
0.002994422549381852,
|
| 1544 |
+
0.003007517298683524,
|
| 1545 |
+
0.0030048126354813576,
|
| 1546 |
+
0.002996899299323559,
|
| 1547 |
+
0.003007491761818528,
|
| 1548 |
+
0.003005539756268263,
|
| 1549 |
+
0.0030135696940124033,
|
| 1550 |
+
0.002997032729908824,
|
| 1551 |
+
0.002989833327010274,
|
| 1552 |
+
0.0029904266446828844,
|
| 1553 |
+
0.003013614621013403,
|
| 1554 |
+
0.0029990132432430984,
|
| 1555 |
+
0.003017691373825073,
|
| 1556 |
+
0.0030073697585612535,
|
| 1557 |
+
0.0029920847062021496,
|
| 1558 |
+
0.0030065272469073532,
|
| 1559 |
+
0.0030173000413924456,
|
| 1560 |
+
0.003007012689486146,
|
| 1561 |
+
0.002982444940134883,
|
| 1562 |
+
0.0030011752899736168,
|
| 1563 |
+
0.0030100908223539592,
|
| 1564 |
+
0.003008154286071658,
|
| 1565 |
+
0.003012208789587021,
|
| 1566 |
+
0.0030078441463410855,
|
| 1567 |
+
0.0029961461946368217,
|
| 1568 |
+
0.0029932641237974166,
|
| 1569 |
+
0.003014317499473691,
|
| 1570 |
+
0.0030071069672703743,
|
| 1571 |
+
0.0030032399855554103,
|
| 1572 |
+
0.0030113482289016247,
|
| 1573 |
+
0.002999826110899448,
|
| 1574 |
+
0.00298927896656096,
|
| 1575 |
+
0.003014590796083212,
|
| 1576 |
+
0.003004206595942378,
|
| 1577 |
+
0.0030044769868254662,
|
| 1578 |
+
0.0030059030558913947,
|
| 1579 |
+
0.002998601896688342,
|
| 1580 |
+
0.003013147823512554,
|
| 1581 |
+
0.003006438361480832,
|
| 1582 |
+
0.0029977593012154103,
|
| 1583 |
+
0.0030116643477231266,
|
| 1584 |
+
0.0030077799037098885,
|
| 1585 |
+
0.003014070875942707,
|
| 1586 |
+
0.0030064770951867105,
|
| 1587 |
+
0.00299388044513762,
|
| 1588 |
+
0.0029888609424233437,
|
| 1589 |
+
0.003004243541508913,
|
| 1590 |
+
0.003001367850229144,
|
| 1591 |
+
0.0029946482833474873,
|
| 1592 |
+
0.0029978380631655453,
|
| 1593 |
+
0.0029964977130293845,
|
| 1594 |
+
0.0030071150232106447,
|
| 1595 |
+
0.002994759660214186,
|
| 1596 |
+
0.003014381527900696,
|
| 1597 |
+
0.00300535392947495,
|
| 1598 |
+
0.003016221933066845,
|
| 1599 |
+
0.003002598928287625,
|
| 1600 |
+
0.003007428189739585,
|
| 1601 |
+
0.0030110098235309126,
|
| 1602 |
+
0.003003385588526726,
|
| 1603 |
+
0.0030025608371943235,
|
| 1604 |
+
0.0030146711785346268,
|
| 1605 |
+
0.002997990632429719,
|
| 1606 |
+
0.003008288126438856,
|
| 1607 |
+
0.0030054442305117845,
|
| 1608 |
+
0.0030082505848258736,
|
| 1609 |
+
0.00300227427855134,
|
| 1610 |
+
0.0029844361636787655,
|
| 1611 |
+
0.0030097624473273753,
|
| 1612 |
+
0.003012060420587659,
|
| 1613 |
+
0.0029899550601840017,
|
| 1614 |
+
0.0029927427042275665,
|
| 1615 |
+
0.003013359569013119,
|
| 1616 |
+
0.00300162298604846,
|
| 1617 |
+
0.002986516896635294,
|
| 1618 |
+
0.0029961772169917822,
|
| 1619 |
+
0.0030217792373150587,
|
| 1620 |
+
0.0029937389306724073,
|
| 1621 |
+
0.0030095004476606846,
|
| 1622 |
+
0.0029998531099408865,
|
| 1623 |
+
0.0030131762009114028,
|
| 1624 |
+
0.0030067174136638643,
|
| 1625 |
+
0.003011715589091182,
|
| 1626 |
+
0.0030041988380253314,
|
| 1627 |
+
0.002996876519173384,
|
| 1628 |
+
0.0029971723817288874,
|
| 1629 |
+
0.003001107610762119,
|
| 1630 |
+
0.003001211220398545,
|
| 1631 |
+
0.0030151842907071113,
|
| 1632 |
+
0.0029931504651904105,
|
| 1633 |
+
0.0030120210070163013,
|
| 1634 |
+
0.0030065411608666184,
|
| 1635 |
+
0.003008819175884128,
|
| 1636 |
+
0.0029951191507279875
|
| 1637 |
+
],
    "lr": [
      0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001,
      0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001,
      0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001,
      0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001,
      0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001,
      0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001,
      0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001,
      0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001,
      0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001,
      0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001,
      0.001, 0.001, 0.001, 0.001, 0.001, 0.001,
      0.0005, 0.0005, 0.0005, 0.0005, 0.0005, 0.0005, 0.0005, 0.0005, 0.0005, 0.0005,
      0.0005, 0.0005, 0.0005, 0.0005, 0.0005, 0.0005, 0.0005, 0.0005, 0.0005, 0.0005,
      0.0005, 0.0005, 0.0005, 0.0005, 0.0005,
      0.00025, 0.00025, 0.00025, 0.00025, 0.00025, 0.00025, 0.00025, 0.00025, 0.00025, 0.00025,
      0.00025, 0.00025, 0.00025, 0.00025, 0.00025, 0.00025, 0.00025, 0.00025, 0.00025, 0.00025,
      0.00025,
      0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125,
      0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125,
      0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125,
      0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125,
      0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125,
      0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125,
      0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125, 0.000125,
      0.000125, 0.000125,
      6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05,
      6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05,
      6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05,
      6.25e-05, 6.25e-05, 6.25e-05, 6.25e-05,
      3.125e-05, 3.125e-05, 3.125e-05, 3.125e-05, 3.125e-05, 3.125e-05, 3.125e-05, 3.125e-05, 3.125e-05, 3.125e-05,
      3.125e-05, 3.125e-05, 3.125e-05, 3.125e-05, 3.125e-05, 3.125e-05,
      1.5625e-05, 1.5625e-05, 1.5625e-05, 1.5625e-05, 1.5625e-05, 1.5625e-05, 1.5625e-05, 1.5625e-05, 1.5625e-05, 1.5625e-05,
      1.5625e-05, 1.5625e-05, 1.5625e-05, 1.5625e-05, 1.5625e-05, 1.5625e-05, 1.5625e-05, 1.5625e-05,
      7.8125e-06, 7.8125e-06, 7.8125e-06, 7.8125e-06, 7.8125e-06, 7.8125e-06, 7.8125e-06, 7.8125e-06, 7.8125e-06, 7.8125e-06,
      7.8125e-06, 7.8125e-06, 7.8125e-06, 7.8125e-06, 7.8125e-06, 7.8125e-06, 7.8125e-06,
      3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06,
      3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06,
      3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06, 3.90625e-06,
      3.90625e-06,
      1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06,
      1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06,
      1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06, 1.953125e-06,
      1e-06, 1e-06, 1e-06, 1e-06, 1e-06, 1e-06, 1e-06, 1e-06, 1e-06, 1e-06,
      1e-06, 1e-06, 1e-06, 1e-06, 1e-06, 1e-06, 1e-06, 1e-06, 1e-06, 1e-06,
      1e-06, 1e-06, 1e-06, 1e-06, 1e-06, 1e-06, 1e-06, 1e-06, 1e-06, 1e-06,
      1e-06, 1e-06, 1e-06, 1e-06
    ]
  }
}
validation_metrics_v1.1.json
ADDED
@@ -0,0 +1,52 @@
{
  "n_test_samples": 100,
  "temperature": {
    "rmse_celsius": 3.3129,
    "mae_celsius": 2.4032,
    "max_error_celsius": 47.3145,
    "r2": 0.993632,
    "per_step_rmse": [
      4.025,
      3.2619,
      3.3331,
      3.0046,
      2.8105
    ]
  },
  "pressure": {
    "rmse_mpa": 0.3544,
    "mae_mpa": 0.2559,
    "max_error_mpa": 5.0404,
    "r2": 0.996742,
    "per_step_rmse": [
      0.3555,
      0.3642,
      0.3263,
      0.3322,
      0.3903
    ]
  },
  "physics_violation_rate_pct": 0.0,
  "inference": {
    "mean_ms": 3.188,
    "median_ms": 2.683,
    "speedup_vs_simulator": 282350.0
  },
  "model": {
    "n_params": 57802,
    "best_epoch": 352,
    "best_val_loss": 0.00038183890865184367
  },
  "thresholds": {
    "rmse_temperature_target": 5.0,
    "rmse_temperature_pass": true,
    "r2_target": 0.95,
    "r2_temperature_pass": true,
    "r2_pressure_pass": true,
    "inference_target_ms": 1000,
    "inference_pass": true,
    "speedup_target": 1000,
    "speedup_pass": true
  },
  "date": "2026-03-30 17:31:03"
}
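The `thresholds` block makes `validation_metrics_v1.1.json` self-checking: each `*_pass` flag should follow from comparing a metric to its target. A small sketch of that cross-check, with the metric values copied from the file above; the comparison directions (RMSE and latency at or below target, R² and speedup at or above) are an assumption:

```python
# Re-derive the *_pass flags in validation_metrics_v1.1.json from the
# "thresholds" targets. Values are copied from the file above.
metrics = {
    "temperature": {"rmse_celsius": 3.3129, "r2": 0.993632},
    "pressure": {"r2": 0.996742},
    "inference": {"mean_ms": 3.188, "speedup_vs_simulator": 282350.0},
    "thresholds": {"rmse_temperature_target": 5.0, "r2_target": 0.95,
                   "inference_target_ms": 1000, "speedup_target": 1000},
}

t = metrics["thresholds"]
checks = {
    "rmse_temperature_pass": metrics["temperature"]["rmse_celsius"] <= t["rmse_temperature_target"],
    "r2_temperature_pass": metrics["temperature"]["r2"] >= t["r2_target"],
    "r2_pressure_pass": metrics["pressure"]["r2"] >= t["r2_target"],
    "inference_pass": metrics["inference"]["mean_ms"] <= t["inference_target_ms"],
    "speedup_pass": metrics["inference"]["speedup_vs_simulator"] >= t["speedup_target"],
}
print(all(checks.values()))  # True, matching the flags recorded in the file
```

In a downstream pipeline the same dictionary could be built from `json.load` on the file itself, so a regenerated metrics file is re-validated rather than trusted.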