PotterWhite committed on
Commit
6fc4a55
·
1 Parent(s): 2889c12

feat: fine-tune MODNet-BN on photographic dataset (epochs 1-15)


- Supervised fine-tuning of MODNet with batch normalization
- Trained for 15 epochs
- Saved per-epoch checkpoints + best model (modnet_bn_best.ckpt)
- Training log: block1_2_training_20260319_154018.log
- Validation visualization results saved for each epoch (output/epoch_*_val.png)
- Track *.png via Git LFS to comply with HuggingFace binary storage policy
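The per-epoch checkpoint and best-model bookkeeping listed above can be sketched as a small loop; `select_best` and the commented-out save calls are illustrative names, not taken from the actual training script, and the sample values are the validation L1 losses that the log reports for epochs 1-3:

```python
def select_best(val_losses):
    """Track the lowest validation loss across epochs, as a
    'save best model on improvement' loop would."""
    best_epoch, best_loss = None, float("inf")
    for epoch, loss in enumerate(val_losses, start=1):
        # A per-epoch checkpoint would be written here, e.g.
        # ./checkpoints/modnet_bn_epoch_{epoch:02d}.ckpt
        if loss < best_loss:
            best_epoch, best_loss = epoch, loss
            # The best model would be re-saved here as modnet_bn_best.ckpt
    return best_epoch, best_loss

# Validation L1 losses for epochs 1-3 as reported in the training log:
print(select_best([0.0264, 0.0175, 0.0146]))  # -> (3, 0.0146)
```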

Files changed (33)
  1. .gitattributes +1 -0
  2. photographic/finetune/checkpoints/modnet_bn_best.ckpt +3 -0
  3. photographic/finetune/checkpoints/modnet_bn_epoch_01.ckpt +3 -0
  4. photographic/finetune/checkpoints/modnet_bn_epoch_02.ckpt +3 -0
  5. photographic/finetune/checkpoints/modnet_bn_epoch_03.ckpt +3 -0
  6. photographic/finetune/checkpoints/modnet_bn_epoch_04.ckpt +3 -0
  7. photographic/finetune/checkpoints/modnet_bn_epoch_05.ckpt +3 -0
  8. photographic/finetune/checkpoints/modnet_bn_epoch_06.ckpt +3 -0
  9. photographic/finetune/checkpoints/modnet_bn_epoch_07.ckpt +3 -0
  10. photographic/finetune/checkpoints/modnet_bn_epoch_08.ckpt +3 -0
  11. photographic/finetune/checkpoints/modnet_bn_epoch_09.ckpt +3 -0
  12. photographic/finetune/checkpoints/modnet_bn_epoch_10.ckpt +3 -0
  13. photographic/finetune/checkpoints/modnet_bn_epoch_11.ckpt +3 -0
  14. photographic/finetune/checkpoints/modnet_bn_epoch_12.ckpt +3 -0
  15. photographic/finetune/checkpoints/modnet_bn_epoch_13.ckpt +3 -0
  16. photographic/finetune/checkpoints/modnet_bn_epoch_14.ckpt +3 -0
  17. photographic/finetune/checkpoints/modnet_bn_epoch_15.ckpt +3 -0
  18. photographic/finetune/logs/block1_2_training_20260319_154018.log +501 -0
  19. photographic/finetune/output/epoch_01_val.png +3 -0
  20. photographic/finetune/output/epoch_02_val.png +3 -0
  21. photographic/finetune/output/epoch_03_val.png +3 -0
  22. photographic/finetune/output/epoch_04_val.png +3 -0
  23. photographic/finetune/output/epoch_05_val.png +3 -0
  24. photographic/finetune/output/epoch_06_val.png +3 -0
  25. photographic/finetune/output/epoch_07_val.png +3 -0
  26. photographic/finetune/output/epoch_08_val.png +3 -0
  27. photographic/finetune/output/epoch_09_val.png +3 -0
  28. photographic/finetune/output/epoch_10_val.png +3 -0
  29. photographic/finetune/output/epoch_11_val.png +3 -0
  30. photographic/finetune/output/epoch_12_val.png +3 -0
  31. photographic/finetune/output/epoch_13_val.png +3 -0
  32. photographic/finetune/output/epoch_14_val.png +3 -0
  33. photographic/finetune/output/epoch_15_val.png +3 -0
.gitattributes CHANGED
@@ -33,4 +33,5 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ *.png filter=lfs diff=lfs merge=lfs -text
  *.rknn filter=lfs diff=lfs merge=lfs -text
photographic/finetune/checkpoints/modnet_bn_best.ckpt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d13b1d56ab358a0262ff1b6e1d40da2a84c7d03aae32fb57b572cd7f952febff
+ size 26310946
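The three added lines are a Git LFS pointer stub: the repository stores only this small text file, while the ~25 MB checkpoint itself lives in LFS storage, keyed by its SHA-256. A minimal sketch of parsing this key-value format (the `parse_lfs_pointer` helper is hypothetical; Git LFS itself performs stricter validation):

```python
def parse_lfs_pointer(text):
    """Split each 'key value' line of a Git LFS pointer into a dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:d13b1d56ab358a0262ff1b6e1d40da2a84c7d03aae32fb57b572cd7f952febff
size 26310946"""

info = parse_lfs_pointer(pointer)
print(info["oid"], info["size"])
```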
photographic/finetune/checkpoints/modnet_bn_epoch_01.ckpt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3ac3a392dfdcc7d78654d6999862464b31c1e17620fa83204ded0562bbd5fac3
+ size 26312710
photographic/finetune/checkpoints/modnet_bn_epoch_02.ckpt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ff6e24f4af31afe083ea7b239b9c5d5f05a95c9655de2294dd4ed107b1d030e4
+ size 26312710
photographic/finetune/checkpoints/modnet_bn_epoch_03.ckpt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:29a6297390fc658097dd05ae49c55da1721d67a306e8dcf2cbd8351f759000da
+ size 26312710
photographic/finetune/checkpoints/modnet_bn_epoch_04.ckpt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b41fb708e9e6af7a425779e4f73377d55325912247e5f0201d6bf5ce3cd0b036
+ size 26312710
photographic/finetune/checkpoints/modnet_bn_epoch_05.ckpt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f2c948867548955de49fba5228207e596a9e62f6dd9485c57209d395fdf37b30
+ size 26312710
photographic/finetune/checkpoints/modnet_bn_epoch_06.ckpt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0e85e352d8a40c3b5be67a954d5d64a13c3891d80b6f82d43e6141446293b30b
+ size 26312710
photographic/finetune/checkpoints/modnet_bn_epoch_07.ckpt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:60d755aa2b7a5054e138e2053d21fea1bcdf7e9e7da2027593a99cffa2e771da
+ size 26312710
photographic/finetune/checkpoints/modnet_bn_epoch_08.ckpt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f47c54080415a2fd8073224c6bf2776f404921ed43c4b2bad797df9ecb05b5e1
+ size 26312710
photographic/finetune/checkpoints/modnet_bn_epoch_09.ckpt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6caee8ebaf7c84085dcb0c894ea4248d0c22fc4e3eb998c64ea563c872975dbd
+ size 26312710
photographic/finetune/checkpoints/modnet_bn_epoch_10.ckpt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:261f738d8cd54437f2e22cf0262798d94e91b80dedf6abb03d398ce113e4ff7e
+ size 26312710
photographic/finetune/checkpoints/modnet_bn_epoch_11.ckpt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0facecb7392023dce26905a6923eb5c1bfd5dab72c6714f82133e65f693a089b
+ size 26312710
photographic/finetune/checkpoints/modnet_bn_epoch_12.ckpt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:34aae7d669d9fa04f1576e3d0d7a3536fd1d530967eb89a6be9c0431cd171316
+ size 26312710
photographic/finetune/checkpoints/modnet_bn_epoch_13.ckpt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6c9fcffe65787a0abc8a92bef723c58a471683e3f193ab1877795b356a6e0eec
+ size 26312710
photographic/finetune/checkpoints/modnet_bn_epoch_14.ckpt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d926f191354cecd46163084a8937d3d1c4c7119dc490cd870341e1cf28b2ce84
+ size 26312710
photographic/finetune/checkpoints/modnet_bn_epoch_15.ckpt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f6b174ddd99a0ee46d2608b235eac93eee0d330297c2c46c7766dbdb94c080e8
+ size 26312710
photographic/finetune/logs/block1_2_training_20260319_154018.log ADDED
@@ -0,0 +1,501 @@
1
+ [Config] Device: cuda
2
+ [Config] Epochs: 15, BS: 8, LR: 0.01, Input: 512x512
3
+
4
+ [Phase 1.2] Initializing data pipeline...
5
+ [Dataset] Loaded 9421 samples from /development/docker_volumes/src/ai/image-matting/primary-folder/dataset/P3M-10k/train/blurred_image
6
+ [ValDataset] Loaded 500 validation samples
7
+
8
+ [Phase 1.2] Instantiating Pure-BN MODNet with pretrained backbone...
9
+ [Model] Total parameters: 6,487,795
10
+ [Model] Trainable parameters: 6,487,795
11
+
12
+ ============================================================
13
+ BLOCK 1.2: Fine-tune Training Started
14
+ 9421 train samples, 500 val samples
15
+ ============================================================
16
+
17
+ [Epoch 1/15] LR: 0.010000
18
+ /home/developer/image-matting/primary-folder/helmsman.git/.venv/lib/python3.8/site-packages/torch/nn/modules/conv.py:459: UserWarning: Applied workaround for CuDNN issue, install nvrtc.so (Triggered internally at ../aten/src/ATen/native/cudnn/Conv_v8.cpp:80.)
19
+ return F.conv2d(input, weight, bias, self.stride,
20
+ [Batch 0/1177] Loss: 4.7716 (Sem: 3.2526, Det: 0.5170, Mat: 1.0019)
21
+ [Batch 50/1177] Loss: 1.5606 (Sem: 0.8672, Det: 0.2922, Mat: 0.4012)
22
+ [Batch 100/1177] Loss: 0.8618 (Sem: 0.3915, Det: 0.2229, Mat: 0.2475)
23
+ [Batch 150/1177] Loss: 0.4850 (Sem: 0.1993, Det: 0.1355, Mat: 0.1502)
24
+ [Batch 200/1177] Loss: 0.5095 (Sem: 0.1943, Det: 0.1474, Mat: 0.1678)
25
+ [Batch 250/1177] Loss: 0.7951 (Sem: 0.3426, Det: 0.1971, Mat: 0.2553)
26
+ [Batch 300/1177] Loss: 0.5996 (Sem: 0.3256, Det: 0.1127, Mat: 0.1613)
27
+ [Batch 350/1177] Loss: 0.7075 (Sem: 0.3678, Det: 0.1569, Mat: 0.1827)
28
+ [Batch 400/1177] Loss: 0.2635 (Sem: 0.0839, Det: 0.0895, Mat: 0.0901)
29
+ [Batch 450/1177] Loss: 0.5693 (Sem: 0.2012, Det: 0.1843, Mat: 0.1838)
30
+ [Batch 500/1177] Loss: 0.4436 (Sem: 0.1420, Det: 0.1482, Mat: 0.1534)
31
+ [Batch 550/1177] Loss: 0.3895 (Sem: 0.1922, Det: 0.0822, Mat: 0.1151)
32
+ [Batch 600/1177] Loss: 0.7845 (Sem: 0.3778, Det: 0.2140, Mat: 0.1927)
33
+ [Batch 650/1177] Loss: 0.3549 (Sem: 0.1481, Det: 0.0939, Mat: 0.1129)
34
+ [Batch 700/1177] Loss: 0.5168 (Sem: 0.2789, Det: 0.0989, Mat: 0.1390)
35
+ [Batch 750/1177] Loss: 0.3253 (Sem: 0.1425, Det: 0.0830, Mat: 0.0999)
36
+ [Batch 800/1177] Loss: 0.4602 (Sem: 0.2363, Det: 0.0994, Mat: 0.1244)
37
+ [Batch 850/1177] Loss: 0.3592 (Sem: 0.1418, Det: 0.1047, Mat: 0.1127)
38
+ [Batch 900/1177] Loss: 0.5242 (Sem: 0.2652, Det: 0.1141, Mat: 0.1449)
39
+ [Batch 950/1177] Loss: 0.4759 (Sem: 0.2655, Det: 0.0892, Mat: 0.1212)
40
+ [Batch 1000/1177] Loss: 0.2526 (Sem: 0.0789, Det: 0.0892, Mat: 0.0845)
41
+ [Batch 1050/1177] Loss: 0.1583 (Sem: 0.0482, Det: 0.0600, Mat: 0.0501)
42
+ [Batch 1100/1177] Loss: 0.2473 (Sem: 0.0952, Det: 0.0762, Mat: 0.0759)
43
+ [Batch 1150/1177] Loss: 0.7236 (Sem: 0.3600, Det: 0.1492, Mat: 0.2143)
44
+ [Epoch 1 Summary] Avg Loss: 0.5410 (Sem: 0.2347, Det: 0.1418, Mat: 0.1645)
45
+ [Validation] Running on 500 samples...
46
+ [Viz] Saved validation preview -> ./output/epoch_01_val.png
47
+ [Validation] Avg L1 Loss: 0.0264
48
+ [Checkpoint] Saved -> ./checkpoints/modnet_bn_epoch_01.ckpt
49
+ [Best Model] New best! Val Loss: 0.0264 -> ./checkpoints/modnet_bn_best.ckpt
50
+
51
+ [Epoch 2/15] LR: 0.010000
52
+ [Batch 0/1177] Loss: 0.2944 (Sem: 0.0749, Det: 0.1179, Mat: 0.1015)
53
+ [Batch 50/1177] Loss: 0.2093 (Sem: 0.0791, Det: 0.0656, Mat: 0.0647)
54
+ [Batch 100/1177] Loss: 0.3175 (Sem: 0.1058, Det: 0.1109, Mat: 0.1008)
55
+ [Batch 150/1177] Loss: 0.7540 (Sem: 0.4597, Det: 0.1047, Mat: 0.1896)
56
+ [Batch 200/1177] Loss: 0.2263 (Sem: 0.0545, Det: 0.0908, Mat: 0.0810)
57
+ [Batch 250/1177] Loss: 0.3460 (Sem: 0.0918, Det: 0.1284, Mat: 0.1258)
58
+ [Batch 300/1177] Loss: 0.2823 (Sem: 0.1241, Det: 0.0682, Mat: 0.0900)
59
+ [Batch 350/1177] Loss: 0.3477 (Sem: 0.1194, Det: 0.1058, Mat: 0.1225)
60
+ [Batch 400/1177] Loss: 0.3467 (Sem: 0.1870, Det: 0.0708, Mat: 0.0890)
61
+ [Batch 450/1177] Loss: 0.2807 (Sem: 0.0717, Det: 0.1043, Mat: 0.1046)
62
+ [Batch 500/1177] Loss: 0.1940 (Sem: 0.0406, Det: 0.0849, Mat: 0.0684)
63
+ [Batch 550/1177] Loss: 0.1989 (Sem: 0.0672, Det: 0.0682, Mat: 0.0635)
64
+ [Batch 600/1177] Loss: 0.3122 (Sem: 0.1179, Det: 0.0993, Mat: 0.0951)
65
+ [Batch 650/1177] Loss: 0.2394 (Sem: 0.0630, Det: 0.0938, Mat: 0.0826)
66
+ [Batch 700/1177] Loss: 0.1376 (Sem: 0.0333, Det: 0.0558, Mat: 0.0485)
67
+ [Batch 750/1177] Loss: 0.4892 (Sem: 0.1848, Det: 0.1534, Mat: 0.1510)
68
+ [Batch 800/1177] Loss: 0.2804 (Sem: 0.0503, Det: 0.1240, Mat: 0.1061)
69
+ [Batch 850/1177] Loss: 0.3104 (Sem: 0.1559, Det: 0.0643, Mat: 0.0902)
70
+ [Batch 900/1177] Loss: 0.2301 (Sem: 0.0558, Det: 0.0924, Mat: 0.0818)
71
+ [Batch 950/1177] Loss: 0.2220 (Sem: 0.0669, Det: 0.0782, Mat: 0.0769)
72
+ [Batch 1000/1177] Loss: 0.2142 (Sem: 0.0996, Det: 0.0552, Mat: 0.0594)
73
+ [Batch 1050/1177] Loss: 0.1736 (Sem: 0.0447, Det: 0.0719, Mat: 0.0570)
74
+ [Batch 1100/1177] Loss: 0.2882 (Sem: 0.1540, Det: 0.0624, Mat: 0.0718)
75
+ [Batch 1150/1177] Loss: 0.2418 (Sem: 0.0783, Det: 0.0841, Mat: 0.0795)
76
+ [Epoch 2 Summary] Avg Loss: 0.3054 (Sem: 0.1215, Det: 0.0891, Mat: 0.0947)
77
+ [Validation] Running on 500 samples...
78
+ [Viz] Saved validation preview -> ./output/epoch_02_val.png
79
+ [Validation] Avg L1 Loss: 0.0175
80
+ [Checkpoint] Saved -> ./checkpoints/modnet_bn_epoch_02.ckpt
81
+ [Best Model] New best! Val Loss: 0.0175 -> ./checkpoints/modnet_bn_best.ckpt
82
+
83
+ [Epoch 3/15] LR: 0.010000
84
+ [Batch 0/1177] Loss: 0.6352 (Sem: 0.2609, Det: 0.1572, Mat: 0.2171)
85
+ [Batch 50/1177] Loss: 0.3348 (Sem: 0.1281, Det: 0.0975, Mat: 0.1092)
86
+ [Batch 100/1177] Loss: 0.3674 (Sem: 0.1876, Det: 0.0693, Mat: 0.1105)
87
+ [Batch 150/1177] Loss: 0.3275 (Sem: 0.0932, Det: 0.1201, Mat: 0.1142)
88
+ [Batch 200/1177] Loss: 0.2068 (Sem: 0.0410, Det: 0.0922, Mat: 0.0736)
89
+ [Batch 250/1177] Loss: 0.1706 (Sem: 0.0457, Det: 0.0692, Mat: 0.0558)
90
+ [Batch 300/1177] Loss: 0.1748 (Sem: 0.0673, Det: 0.0526, Mat: 0.0549)
91
+ [Batch 350/1177] Loss: 0.3631 (Sem: 0.2117, Det: 0.0662, Mat: 0.0852)
92
+ [Batch 400/1177] Loss: 0.1649 (Sem: 0.0395, Det: 0.0660, Mat: 0.0594)
93
+ [Batch 450/1177] Loss: 0.4294 (Sem: 0.2081, Det: 0.1020, Mat: 0.1193)
94
+ [Batch 500/1177] Loss: 0.1361 (Sem: 0.0286, Det: 0.0594, Mat: 0.0481)
95
+ [Batch 550/1177] Loss: 0.2998 (Sem: 0.1362, Det: 0.0752, Mat: 0.0884)
96
+ [Batch 600/1177] Loss: 0.2183 (Sem: 0.0433, Det: 0.0945, Mat: 0.0805)
97
+ [Batch 650/1177] Loss: 0.2202 (Sem: 0.0576, Det: 0.0836, Mat: 0.0790)
98
+ [Batch 700/1177] Loss: 0.2126 (Sem: 0.0564, Det: 0.0814, Mat: 0.0749)
99
+ [Batch 750/1177] Loss: 0.1483 (Sem: 0.0375, Det: 0.0569, Mat: 0.0539)
100
+ [Batch 800/1177] Loss: 0.1514 (Sem: 0.0400, Det: 0.0589, Mat: 0.0524)
101
+ [Batch 850/1177] Loss: 0.5702 (Sem: 0.2086, Det: 0.2018, Mat: 0.1598)
102
+ [Batch 900/1177] Loss: 0.1545 (Sem: 0.0347, Det: 0.0657, Mat: 0.0540)
103
+ [Batch 950/1177] Loss: 0.2103 (Sem: 0.0631, Det: 0.0755, Mat: 0.0717)
104
+ [Batch 1000/1177] Loss: 0.2197 (Sem: 0.0841, Det: 0.0579, Mat: 0.0777)
105
+ [Batch 1050/1177] Loss: 0.2303 (Sem: 0.0643, Det: 0.0816, Mat: 0.0844)
106
+ [Batch 1100/1177] Loss: 0.1060 (Sem: 0.0345, Det: 0.0367, Mat: 0.0349)
107
+ [Batch 1150/1177] Loss: 0.1661 (Sem: 0.0421, Det: 0.0723, Mat: 0.0517)
108
+ [Epoch 3 Summary] Avg Loss: 0.2433 (Sem: 0.0863, Det: 0.0782, Mat: 0.0788)
109
+ [Validation] Running on 500 samples...
110
+ [Viz] Saved validation preview -> ./output/epoch_03_val.png
111
+ [Validation] Avg L1 Loss: 0.0146
112
+ [Checkpoint] Saved -> ./checkpoints/modnet_bn_epoch_03.ckpt
113
+ [Best Model] New best! Val Loss: 0.0146 -> ./checkpoints/modnet_bn_best.ckpt
114
+
115
+ [Epoch 4/15] LR: 0.010000
116
+ [Batch 0/1177] Loss: 0.0827 (Sem: 0.0181, Det: 0.0351, Mat: 0.0295)
117
+ [Batch 50/1177] Loss: 0.2310 (Sem: 0.0909, Det: 0.0665, Mat: 0.0735)
118
+ [Batch 100/1177] Loss: 0.1708 (Sem: 0.0291, Det: 0.0757, Mat: 0.0660)
119
+ [Batch 150/1177] Loss: 0.1307 (Sem: 0.0291, Det: 0.0575, Mat: 0.0441)
120
+ [Batch 200/1177] Loss: 0.2774 (Sem: 0.0650, Det: 0.1123, Mat: 0.1000)
121
+ [Batch 250/1177] Loss: 0.2115 (Sem: 0.0612, Det: 0.0803, Mat: 0.0699)
122
+ [Batch 300/1177] Loss: 0.2892 (Sem: 0.1007, Det: 0.0876, Mat: 0.1009)
123
+ [Batch 350/1177] Loss: 0.0753 (Sem: 0.0191, Det: 0.0311, Mat: 0.0250)
124
+ [Batch 400/1177] Loss: 0.1863 (Sem: 0.0456, Det: 0.0829, Mat: 0.0578)
125
+ [Batch 450/1177] Loss: 0.2065 (Sem: 0.0891, Det: 0.0532, Mat: 0.0641)
126
+ [Batch 500/1177] Loss: 0.1365 (Sem: 0.0226, Det: 0.0661, Mat: 0.0478)
127
+ [Batch 550/1177] Loss: 0.1758 (Sem: 0.0561, Det: 0.0618, Mat: 0.0579)
128
+ [Batch 600/1177] Loss: 0.2570 (Sem: 0.1225, Det: 0.0621, Mat: 0.0724)
129
+ [Batch 650/1177] Loss: 0.1668 (Sem: 0.0528, Det: 0.0591, Mat: 0.0549)
130
+ [Batch 700/1177] Loss: 0.1546 (Sem: 0.0332, Det: 0.0663, Mat: 0.0550)
131
+ [Batch 750/1177] Loss: 0.2892 (Sem: 0.1583, Det: 0.0544, Mat: 0.0765)
132
+ [Batch 800/1177] Loss: 0.1323 (Sem: 0.0335, Det: 0.0531, Mat: 0.0458)
133
+ [Batch 850/1177] Loss: 0.1183 (Sem: 0.0316, Det: 0.0450, Mat: 0.0418)
134
+ [Batch 900/1177] Loss: 0.4255 (Sem: 0.2485, Det: 0.0678, Mat: 0.1093)
135
+ [Batch 950/1177] Loss: 0.3813 (Sem: 0.1409, Det: 0.1058, Mat: 0.1346)
136
+ [Batch 1000/1177] Loss: 0.5709 (Sem: 0.3169, Det: 0.1042, Mat: 0.1499)
137
+ [Batch 1050/1177] Loss: 0.1550 (Sem: 0.0417, Det: 0.0550, Mat: 0.0582)
138
+ [Batch 1100/1177] Loss: 0.2526 (Sem: 0.0643, Det: 0.0951, Mat: 0.0932)
139
+ [Batch 1150/1177] Loss: 0.1432 (Sem: 0.0300, Det: 0.0621, Mat: 0.0511)
140
+ [Epoch 4 Summary] Avg Loss: 0.2187 (Sem: 0.0741, Det: 0.0728, Mat: 0.0719)
141
+ [Validation] Running on 500 samples...
142
+ [Viz] Saved validation preview -> ./output/epoch_04_val.png
143
+ [Validation] Avg L1 Loss: 0.0132
144
+ [Checkpoint] Saved -> ./checkpoints/modnet_bn_epoch_04.ckpt
145
+ [Best Model] New best! Val Loss: 0.0132 -> ./checkpoints/modnet_bn_best.ckpt
146
+
147
+ [Epoch 5/15] LR: 0.010000
148
+ [Batch 0/1177] Loss: 0.1214 (Sem: 0.0206, Det: 0.0588, Mat: 0.0419)
149
+ [Batch 50/1177] Loss: 0.1518 (Sem: 0.0274, Det: 0.0655, Mat: 0.0589)
150
+ [Batch 100/1177] Loss: 0.0949 (Sem: 0.0141, Det: 0.0457, Mat: 0.0351)
151
+ [Batch 150/1177] Loss: 0.1150 (Sem: 0.0213, Det: 0.0505, Mat: 0.0432)
152
+ [Batch 200/1177] Loss: 0.2128 (Sem: 0.0683, Det: 0.0695, Mat: 0.0751)
153
+ [Batch 250/1177] Loss: 0.0679 (Sem: 0.0123, Det: 0.0321, Mat: 0.0235)
154
+ [Batch 300/1177] Loss: 0.5063 (Sem: 0.2084, Det: 0.1455, Mat: 0.1524)
155
+ [Batch 350/1177] Loss: 0.2356 (Sem: 0.0736, Det: 0.0868, Mat: 0.0752)
156
+ [Batch 400/1177] Loss: 0.1348 (Sem: 0.0256, Det: 0.0582, Mat: 0.0510)
157
+ [Batch 450/1177] Loss: 0.1133 (Sem: 0.0267, Det: 0.0472, Mat: 0.0394)
158
+ [Batch 500/1177] Loss: 0.1546 (Sem: 0.0437, Det: 0.0583, Mat: 0.0526)
159
+ [Batch 550/1177] Loss: 0.1125 (Sem: 0.0234, Det: 0.0487, Mat: 0.0404)
160
+ [Batch 600/1177] Loss: 0.1849 (Sem: 0.0371, Det: 0.0792, Mat: 0.0686)
161
+ [Batch 650/1177] Loss: 0.1791 (Sem: 0.0391, Det: 0.0732, Mat: 0.0668)
162
+ [Batch 700/1177] Loss: 0.1601 (Sem: 0.0421, Det: 0.0658, Mat: 0.0522)
163
+ [Batch 750/1177] Loss: 0.2811 (Sem: 0.1308, Det: 0.0698, Mat: 0.0806)
164
+ [Batch 800/1177] Loss: 0.2629 (Sem: 0.0809, Det: 0.0931, Mat: 0.0889)
165
+ [Batch 850/1177] Loss: 0.1495 (Sem: 0.0325, Det: 0.0629, Mat: 0.0542)
166
+ [Batch 900/1177] Loss: 0.1961 (Sem: 0.0701, Det: 0.0630, Mat: 0.0630)
167
+ [Batch 950/1177] Loss: 0.1879 (Sem: 0.0379, Det: 0.0843, Mat: 0.0658)
168
+ [Batch 1000/1177] Loss: 0.1042 (Sem: 0.0264, Det: 0.0416, Mat: 0.0362)
169
+ [Batch 1050/1177] Loss: 0.1907 (Sem: 0.0458, Det: 0.0758, Mat: 0.0692)
170
+ [Batch 1100/1177] Loss: 0.0941 (Sem: 0.0202, Det: 0.0404, Mat: 0.0335)
171
+ [Batch 1150/1177] Loss: 0.1857 (Sem: 0.0553, Det: 0.0685, Mat: 0.0619)
172
+ [Epoch 5 Summary] Avg Loss: 0.2009 (Sem: 0.0670, Det: 0.0675, Mat: 0.0664)
173
+ [Validation] Running on 500 samples...
174
+ [Viz] Saved validation preview -> ./output/epoch_05_val.png
175
+ [Validation] Avg L1 Loss: 0.0129
176
+ [Checkpoint] Saved -> ./checkpoints/modnet_bn_epoch_05.ckpt
177
+ [Best Model] New best! Val Loss: 0.0129 -> ./checkpoints/modnet_bn_best.ckpt
178
+
179
+ [Epoch 6/15] LR: 0.001000
180
+ [Batch 0/1177] Loss: 0.1599 (Sem: 0.0456, Det: 0.0647, Mat: 0.0496)
181
+ [Batch 50/1177] Loss: 0.1104 (Sem: 0.0229, Det: 0.0486, Mat: 0.0390)
182
+ [Batch 100/1177] Loss: 0.1206 (Sem: 0.0359, Det: 0.0458, Mat: 0.0388)
183
+ [Batch 150/1177] Loss: 0.1298 (Sem: 0.0381, Det: 0.0475, Mat: 0.0443)
184
+ [Batch 200/1177] Loss: 0.1002 (Sem: 0.0147, Det: 0.0468, Mat: 0.0387)
185
+ [Batch 250/1177] Loss: 0.1609 (Sem: 0.0447, Det: 0.0583, Mat: 0.0579)
186
+ [Batch 300/1177] Loss: 0.1235 (Sem: 0.0238, Det: 0.0546, Mat: 0.0450)
187
+ [Batch 350/1177] Loss: 0.0934 (Sem: 0.0132, Det: 0.0458, Mat: 0.0344)
188
+ [Batch 400/1177] Loss: 0.1266 (Sem: 0.0257, Det: 0.0536, Mat: 0.0473)
189
+ [Batch 450/1177] Loss: 0.1009 (Sem: 0.0218, Det: 0.0440, Mat: 0.0351)
190
+ [Batch 500/1177] Loss: 0.1031 (Sem: 0.0213, Det: 0.0452, Mat: 0.0365)
191
+ [Batch 550/1177] Loss: 0.1924 (Sem: 0.0393, Det: 0.0800, Mat: 0.0731)
192
+ [Batch 600/1177] Loss: 0.3602 (Sem: 0.1339, Det: 0.1045, Mat: 0.1218)
193
+ [Batch 650/1177] Loss: 0.1745 (Sem: 0.0351, Det: 0.0709, Mat: 0.0685)
194
+ [Batch 700/1177] Loss: 0.1811 (Sem: 0.0381, Det: 0.0768, Mat: 0.0663)
195
+ [Batch 750/1177] Loss: 0.3088 (Sem: 0.1219, Det: 0.0767, Mat: 0.1102)
196
+ [Batch 800/1177] Loss: 0.1800 (Sem: 0.0444, Det: 0.0689, Mat: 0.0667)
197
+ [Batch 850/1177] Loss: 0.1749 (Sem: 0.0325, Det: 0.0768, Mat: 0.0655)
198
+ [Batch 900/1177] Loss: 0.1878 (Sem: 0.0446, Det: 0.0704, Mat: 0.0728)
199
+ [Batch 950/1177] Loss: 0.0814 (Sem: 0.0165, Det: 0.0351, Mat: 0.0297)
200
+ [Batch 1000/1177] Loss: 0.1459 (Sem: 0.0274, Det: 0.0624, Mat: 0.0561)
201
+ [Batch 1050/1177] Loss: 0.1162 (Sem: 0.0162, Det: 0.0561, Mat: 0.0439)
202
+ [Batch 1100/1177] Loss: 0.0799 (Sem: 0.0188, Det: 0.0319, Mat: 0.0292)
203
+ [Batch 1150/1177] Loss: 0.0882 (Sem: 0.0223, Det: 0.0324, Mat: 0.0336)
204
+ [Epoch 6 Summary] Avg Loss: 0.1552 (Sem: 0.0405, Det: 0.0603, Mat: 0.0544)
205
+ [Validation] Running on 500 samples...
206
+ [Viz] Saved validation preview -> ./output/epoch_06_val.png
207
+ [Validation] Avg L1 Loss: 0.0096
208
+ [Checkpoint] Saved -> ./checkpoints/modnet_bn_epoch_06.ckpt
209
+ [Best Model] New best! Val Loss: 0.0096 -> ./checkpoints/modnet_bn_best.ckpt
210
+
211
+ [Epoch 7/15] LR: 0.001000
212
+ [Batch 0/1177] Loss: 0.1311 (Sem: 0.0200, Det: 0.0606, Mat: 0.0504)
213
+ [Batch 50/1177] Loss: 0.0862 (Sem: 0.0204, Det: 0.0358, Mat: 0.0299)
214
+ [Batch 100/1177] Loss: 0.2479 (Sem: 0.0432, Det: 0.1132, Mat: 0.0914)
215
+ [Batch 150/1177] Loss: 0.0844 (Sem: 0.0161, Det: 0.0391, Mat: 0.0292)
216
+ [Batch 200/1177] Loss: 0.1277 (Sem: 0.0384, Det: 0.0462, Mat: 0.0432)
217
+ [Batch 250/1177] Loss: 0.2444 (Sem: 0.0882, Det: 0.0785, Mat: 0.0777)
218
+ [Batch 300/1177] Loss: 0.1259 (Sem: 0.0257, Det: 0.0569, Mat: 0.0432)
219
+ [Batch 350/1177] Loss: 0.1280 (Sem: 0.0203, Det: 0.0596, Mat: 0.0481)
220
+ [Batch 400/1177] Loss: 0.4194 (Sem: 0.2144, Det: 0.0955, Mat: 0.1094)
221
+ [Batch 450/1177] Loss: 0.0813 (Sem: 0.0226, Det: 0.0330, Mat: 0.0256)
222
+ [Batch 500/1177] Loss: 0.1102 (Sem: 0.0262, Det: 0.0460, Mat: 0.0380)
223
+ [Batch 550/1177] Loss: 0.0854 (Sem: 0.0196, Det: 0.0350, Mat: 0.0308)
224
+ [Batch 600/1177] Loss: 0.1876 (Sem: 0.0684, Det: 0.0577, Mat: 0.0615)
225
+ [Batch 650/1177] Loss: 0.0978 (Sem: 0.0184, Det: 0.0439, Mat: 0.0355)
226
+ [Batch 700/1177] Loss: 0.1148 (Sem: 0.0332, Det: 0.0453, Mat: 0.0363)
227
+ [Batch 750/1177] Loss: 0.1572 (Sem: 0.0353, Det: 0.0645, Mat: 0.0573)
228
+ [Batch 800/1177] Loss: 0.1961 (Sem: 0.0483, Det: 0.0811, Mat: 0.0667)
229
+ [Batch 850/1177] Loss: 0.1393 (Sem: 0.0228, Det: 0.0642, Mat: 0.0523)
230
+ [Batch 900/1177] Loss: 0.1079 (Sem: 0.0207, Det: 0.0469, Mat: 0.0404)
231
+ [Batch 950/1177] Loss: 0.1388 (Sem: 0.0276, Det: 0.0634, Mat: 0.0479)
232
+ [Batch 1000/1177] Loss: 0.0788 (Sem: 0.0103, Det: 0.0378, Mat: 0.0307)
233
+ [Batch 1050/1177] Loss: 0.1026 (Sem: 0.0277, Det: 0.0407, Mat: 0.0342)
234
+ [Batch 1100/1177] Loss: 0.1016 (Sem: 0.0209, Det: 0.0450, Mat: 0.0357)
235
+ [Batch 1150/1177] Loss: 0.1552 (Sem: 0.0376, Det: 0.0618, Mat: 0.0558)
236
+ [Epoch 7 Summary] Avg Loss: 0.1424 (Sem: 0.0343, Det: 0.0576, Mat: 0.0506)
237
+ [Validation] Running on 500 samples...
238
+ [Viz] Saved validation preview -> ./output/epoch_07_val.png
239
+ [Validation] Avg L1 Loss: 0.0095
240
+ [Checkpoint] Saved -> ./checkpoints/modnet_bn_epoch_07.ckpt
241
+ [Best Model] New best! Val Loss: 0.0095 -> ./checkpoints/modnet_bn_best.ckpt
242
+
243
+ [Epoch 8/15] LR: 0.001000
244
+ [Batch 0/1177] Loss: 0.1112 (Sem: 0.0163, Det: 0.0530, Mat: 0.0419)
245
+ [Batch 50/1177] Loss: 0.1347 (Sem: 0.0280, Det: 0.0578, Mat: 0.0489)
246
+ [Batch 100/1177] Loss: 0.1213 (Sem: 0.0248, Det: 0.0512, Mat: 0.0454)
247
+ [Batch 150/1177] Loss: 0.0851 (Sem: 0.0125, Det: 0.0417, Mat: 0.0309)
248
+ [Batch 200/1177] Loss: 0.0850 (Sem: 0.0132, Det: 0.0401, Mat: 0.0316)
249
+ [Batch 250/1177] Loss: 0.2751 (Sem: 0.0472, Det: 0.1272, Mat: 0.1007)
250
+ [Batch 300/1177] Loss: 0.2679 (Sem: 0.0589, Det: 0.1084, Mat: 0.1006)
251
+ [Batch 350/1177] Loss: 0.1280 (Sem: 0.0199, Det: 0.0602, Mat: 0.0479)
252
+ [Batch 400/1177] Loss: 0.1979 (Sem: 0.0370, Det: 0.0849, Mat: 0.0760)
253
+ [Batch 450/1177] Loss: 0.1556 (Sem: 0.0202, Det: 0.0769, Mat: 0.0584)
254
+ [Batch 500/1177] Loss: 0.0949 (Sem: 0.0159, Det: 0.0439, Mat: 0.0350)
255
+ [Batch 550/1177] Loss: 0.0895 (Sem: 0.0155, Det: 0.0419, Mat: 0.0321)
256
+ [Batch 600/1177] Loss: 0.1501 (Sem: 0.0294, Det: 0.0636, Mat: 0.0571)
257
+ [Batch 650/1177] Loss: 0.1011 (Sem: 0.0177, Det: 0.0468, Mat: 0.0366)
258
+ [Batch 700/1177] Loss: 0.1625 (Sem: 0.0230, Det: 0.0774, Mat: 0.0621)
259
+ [Batch 750/1177] Loss: 0.0882 (Sem: 0.0142, Det: 0.0396, Mat: 0.0344)
260
+ [Batch 800/1177] Loss: 0.1165 (Sem: 0.0216, Det: 0.0536, Mat: 0.0412)
261
+ [Batch 850/1177] Loss: 0.1238 (Sem: 0.0423, Det: 0.0401, Mat: 0.0414)
262
+ [Batch 900/1177] Loss: 0.1029 (Sem: 0.0164, Det: 0.0483, Mat: 0.0382)
263
+ [Batch 950/1177] Loss: 0.1330 (Sem: 0.0163, Det: 0.0646, Mat: 0.0520)
264
+ [Batch 1000/1177] Loss: 0.1224 (Sem: 0.0123, Det: 0.0602, Mat: 0.0500)
265
+ [Batch 1050/1177] Loss: 0.0978 (Sem: 0.0191, Det: 0.0416, Mat: 0.0372)
266
+ [Batch 1100/1177] Loss: 0.0844 (Sem: 0.0178, Det: 0.0371, Mat: 0.0294)
267
+ [Batch 1150/1177] Loss: 0.0952 (Sem: 0.0116, Det: 0.0477, Mat: 0.0358)
268
+ [Epoch 8 Summary] Avg Loss: 0.1367 (Sem: 0.0312, Det: 0.0565, Mat: 0.0490)
269
+ [Validation] Running on 500 samples...
270
+ [Viz] Saved validation preview -> ./output/epoch_08_val.png
271
+ [Validation] Avg L1 Loss: 0.0091
272
+ [Checkpoint] Saved -> ./checkpoints/modnet_bn_epoch_08.ckpt
273
+ [Best Model] New best! Val Loss: 0.0091 -> ./checkpoints/modnet_bn_best.ckpt
274
+
275
+ [Epoch 9/15] LR: 0.001000
276
+ [Batch 0/1177] Loss: 0.1126 (Sem: 0.0221, Det: 0.0480, Mat: 0.0425)
277
+ [Batch 50/1177] Loss: 0.0933 (Sem: 0.0263, Det: 0.0352, Mat: 0.0319)
278
+ [Batch 100/1177] Loss: 0.2441 (Sem: 0.0937, Det: 0.0776, Mat: 0.0728)
279
+ [Batch 150/1177] Loss: 0.1644 (Sem: 0.0310, Det: 0.0699, Mat: 0.0635)
280
+ [Batch 200/1177] Loss: 0.0737 (Sem: 0.0157, Det: 0.0332, Mat: 0.0248)
281
+ [Batch 250/1177] Loss: 0.1503 (Sem: 0.0264, Det: 0.0665, Mat: 0.0573)
282
+ [Batch 300/1177] Loss: 0.1200 (Sem: 0.0181, Det: 0.0553, Mat: 0.0465)
283
+ [Batch 350/1177] Loss: 0.0743 (Sem: 0.0124, Det: 0.0360, Mat: 0.0259)
284
+ [Batch 400/1177] Loss: 0.1130 (Sem: 0.0129, Det: 0.0541, Mat: 0.0461)
285
+ [Batch 450/1177] Loss: 0.2347 (Sem: 0.0718, Det: 0.0804, Mat: 0.0825)
286
+ [Batch 500/1177] Loss: 0.1405 (Sem: 0.0277, Det: 0.0589, Mat: 0.0539)
287
+ [Batch 550/1177] Loss: 0.1272 (Sem: 0.0401, Det: 0.0423, Mat: 0.0448)
288
+ [Batch 600/1177] Loss: 0.1262 (Sem: 0.0197, Det: 0.0587, Mat: 0.0478)
289
+ [Batch 650/1177] Loss: 0.0959 (Sem: 0.0140, Det: 0.0465, Mat: 0.0354)
290
+ [Batch 700/1177] Loss: 0.1048 (Sem: 0.0240, Det: 0.0450, Mat: 0.0358)
291
+ [Batch 750/1177] Loss: 0.1767 (Sem: 0.0271, Det: 0.0808, Mat: 0.0689)
292
+ [Batch 800/1177] Loss: 0.1087 (Sem: 0.0255, Det: 0.0465, Mat: 0.0367)
293
+ [Batch 850/1177] Loss: 0.1343 (Sem: 0.0250, Det: 0.0605, Mat: 0.0488)
294
+ [Batch 900/1177] Loss: 0.1146 (Sem: 0.0166, Det: 0.0536, Mat: 0.0445)
295
+ [Batch 950/1177] Loss: 0.0977 (Sem: 0.0178, Det: 0.0428, Mat: 0.0371)
296
+ [Batch 1000/1177] Loss: 0.0752 (Sem: 0.0122, Det: 0.0329, Mat: 0.0300)
297
+ [Batch 1050/1177] Loss: 0.1101 (Sem: 0.0193, Det: 0.0494, Mat: 0.0414)
298
+ [Batch 1100/1177] Loss: 0.1051 (Sem: 0.0204, Det: 0.0448, Mat: 0.0399)
299
+ [Batch 1150/1177] Loss: 0.0834 (Sem: 0.0143, Det: 0.0383, Mat: 0.0308)
300
+ [Epoch 9 Summary] Avg Loss: 0.1339 (Sem: 0.0301, Det: 0.0558, Mat: 0.0481)
301
+ [Validation] Running on 500 samples...
302
+ [Viz] Saved validation preview -> ./output/epoch_09_val.png
303
+ [Validation] Avg L1 Loss: 0.0092
304
+ [Checkpoint] Saved -> ./checkpoints/modnet_bn_epoch_09.ckpt
305
+
306
+ [Epoch 10/15] LR: 0.001000
307
+ [Batch 0/1177] Loss: 0.1354 (Sem: 0.0334, Det: 0.0540, Mat: 0.0480)
308
+ [Batch 50/1177] Loss: 0.1229 (Sem: 0.0179, Det: 0.0596, Mat: 0.0454)
309
+ [Batch 100/1177] Loss: 0.1102 (Sem: 0.0188, Det: 0.0518, Mat: 0.0396)
310
+ [Batch 150/1177] Loss: 0.1122 (Sem: 0.0218, Det: 0.0499, Mat: 0.0404)
311
+ [Batch 200/1177] Loss: 0.1078 (Sem: 0.0171, Det: 0.0488, Mat: 0.0419)
312
+ [Batch 250/1177] Loss: 0.0920 (Sem: 0.0105, Det: 0.0471, Mat: 0.0343)
313
+ [Batch 300/1177] Loss: 0.1229 (Sem: 0.0188, Det: 0.0579, Mat: 0.0461)
314
+ [Batch 350/1177] Loss: 0.1602 (Sem: 0.0353, Det: 0.0647, Mat: 0.0601)
315
+ [Batch 400/1177] Loss: 0.1096 (Sem: 0.0204, Det: 0.0478, Mat: 0.0415)
316
+ [Batch 450/1177] Loss: 0.0834 (Sem: 0.0153, Det: 0.0379, Mat: 0.0301)
317
+ [Batch 500/1177] Loss: 0.1279 (Sem: 0.0259, Det: 0.0562, Mat: 0.0457)
318
+ [Batch 550/1177] Loss: 0.1270 (Sem: 0.0197, Det: 0.0600, Mat: 0.0473)
319
+ [Batch 600/1177] Loss: 0.2131 (Sem: 0.0947, Det: 0.0541, Mat: 0.0642)
320
+ [Batch 650/1177] Loss: 0.1228 (Sem: 0.0239, Det: 0.0535, Mat: 0.0455)
321
+ [Batch 700/1177] Loss: 0.0653 (Sem: 0.0109, Det: 0.0309, Mat: 0.0235)
322
+ [Batch 750/1177] Loss: 0.1415 (Sem: 0.0299, Det: 0.0602, Mat: 0.0513)
323
+ [Batch 800/1177] Loss: 0.0972 (Sem: 0.0157, Det: 0.0431, Mat: 0.0384)
324
+ [Batch 850/1177] Loss: 0.1195 (Sem: 0.0185, Det: 0.0596, Mat: 0.0415)
325
+ [Batch 900/1177] Loss: 0.1018 (Sem: 0.0180, Det: 0.0479, Mat: 0.0359)
326
+ [Batch 950/1177] Loss: 0.2063 (Sem: 0.0700, Det: 0.0658, Mat: 0.0705)
327
+ [Batch 1000/1177] Loss: 0.1240 (Sem: 0.0219, Det: 0.0565, Mat: 0.0456)
328
+ [Batch 1050/1177] Loss: 0.1373 (Sem: 0.0288, Det: 0.0567, Mat: 0.0518)
329
+ [Batch 1100/1177] Loss: 0.1684 (Sem: 0.0336, Det: 0.0718, Mat: 0.0631)
330
+ [Batch 1150/1177] Loss: 0.0990 (Sem: 0.0189, Det: 0.0427, Mat: 0.0374)
331
+ [Epoch 10 Summary] Avg Loss: 0.1310 (Sem: 0.0287, Det: 0.0551, Mat: 0.0472)
332
+ [Validation] Running on 500 samples...
333
+ [Viz] Saved validation preview -> ./output/epoch_10_val.png
334
+ [Validation] Avg L1 Loss: 0.0088
335
+ [Checkpoint] Saved -> ./checkpoints/modnet_bn_epoch_10.ckpt
336
+ [Best Model] New best! Val Loss: 0.0088 -> ./checkpoints/modnet_bn_best.ckpt
337
+
338
+ [Epoch 11/15] LR: 0.000100
+ [Batch 0/1177] Loss: 0.1061 (Sem: 0.0168, Det: 0.0516, Mat: 0.0377)
+ [Batch 50/1177] Loss: 0.1651 (Sem: 0.0637, Det: 0.0517, Mat: 0.0497)
+ [Batch 100/1177] Loss: 0.1086 (Sem: 0.0170, Det: 0.0521, Mat: 0.0395)
+ [Batch 150/1177] Loss: 0.1220 (Sem: 0.0197, Det: 0.0568, Mat: 0.0454)
+ [Batch 200/1177] Loss: 0.1411 (Sem: 0.0540, Det: 0.0426, Mat: 0.0444)
+ [Batch 250/1177] Loss: 0.0749 (Sem: 0.0186, Det: 0.0304, Mat: 0.0259)
+ [Batch 300/1177] Loss: 0.0814 (Sem: 0.0118, Det: 0.0397, Mat: 0.0299)
+ [Batch 350/1177] Loss: 0.1395 (Sem: 0.0357, Det: 0.0552, Mat: 0.0486)
+ [Batch 400/1177] Loss: 0.1154 (Sem: 0.0184, Det: 0.0513, Mat: 0.0457)
+ [Batch 450/1177] Loss: 0.1178 (Sem: 0.0186, Det: 0.0560, Mat: 0.0432)
+ [Batch 500/1177] Loss: 0.0858 (Sem: 0.0235, Det: 0.0338, Mat: 0.0285)
+ [Batch 550/1177] Loss: 0.0621 (Sem: 0.0116, Det: 0.0292, Mat: 0.0213)
+ [Batch 600/1177] Loss: 0.1035 (Sem: 0.0214, Det: 0.0417, Mat: 0.0404)
+ [Batch 650/1177] Loss: 0.2149 (Sem: 0.0787, Det: 0.0679, Mat: 0.0682)
+ [Batch 700/1177] Loss: 0.1248 (Sem: 0.0236, Det: 0.0574, Mat: 0.0438)
+ [Batch 750/1177] Loss: 0.2053 (Sem: 0.0513, Det: 0.0811, Mat: 0.0728)
+ [Batch 800/1177] Loss: 0.1912 (Sem: 0.0528, Det: 0.0718, Mat: 0.0667)
+ [Batch 850/1177] Loss: 0.1268 (Sem: 0.0281, Det: 0.0530, Mat: 0.0457)
+ [Batch 900/1177] Loss: 0.1140 (Sem: 0.0159, Det: 0.0563, Mat: 0.0419)
+ [Batch 950/1177] Loss: 0.1058 (Sem: 0.0186, Det: 0.0479, Mat: 0.0393)
+ [Batch 1000/1177] Loss: 0.0827 (Sem: 0.0149, Det: 0.0360, Mat: 0.0318)
+ [Batch 1050/1177] Loss: 0.2866 (Sem: 0.0593, Det: 0.1141, Mat: 0.1132)
+ [Batch 1100/1177] Loss: 0.1026 (Sem: 0.0192, Det: 0.0458, Mat: 0.0377)
+ [Batch 1150/1177] Loss: 0.1267 (Sem: 0.0199, Det: 0.0611, Mat: 0.0456)
+ [Epoch 11 Summary] Avg Loss: 0.1296 (Sem: 0.0278, Det: 0.0549, Mat: 0.0469)
+ [Validation] Running on 500 samples...
+ [Viz] Saved validation preview -> ./output/epoch_11_val.png
+ [Validation] Avg L1 Loss: 0.0088
+ [Checkpoint] Saved -> ./checkpoints/modnet_bn_epoch_11.ckpt
+
+ [Epoch 12/15] LR: 0.000100
+ [Batch 0/1177] Loss: 0.1568 (Sem: 0.0611, Det: 0.0467, Mat: 0.0489)
+ [Batch 50/1177] Loss: 0.0905 (Sem: 0.0147, Det: 0.0444, Mat: 0.0314)
+ [Batch 100/1177] Loss: 0.1114 (Sem: 0.0147, Det: 0.0544, Mat: 0.0423)
+ [Batch 150/1177] Loss: 0.1022 (Sem: 0.0168, Det: 0.0494, Mat: 0.0360)
+ [Batch 200/1177] Loss: 0.1142 (Sem: 0.0206, Det: 0.0527, Mat: 0.0409)
+ [Batch 250/1177] Loss: 0.0810 (Sem: 0.0139, Det: 0.0371, Mat: 0.0300)
+ [Batch 300/1177] Loss: 0.2477 (Sem: 0.0846, Det: 0.0812, Mat: 0.0818)
+ [Batch 350/1177] Loss: 0.1004 (Sem: 0.0175, Det: 0.0482, Mat: 0.0346)
+ [Batch 400/1177] Loss: 0.1059 (Sem: 0.0168, Det: 0.0489, Mat: 0.0402)
+ [Batch 450/1177] Loss: 0.1615 (Sem: 0.0232, Det: 0.0742, Mat: 0.0640)
+ [Batch 500/1177] Loss: 0.0699 (Sem: 0.0101, Det: 0.0345, Mat: 0.0253)
+ [Batch 550/1177] Loss: 0.0863 (Sem: 0.0157, Det: 0.0408, Mat: 0.0298)
+ [Batch 600/1177] Loss: 0.0979 (Sem: 0.0221, Det: 0.0398, Mat: 0.0360)
+ [Batch 650/1177] Loss: 0.1045 (Sem: 0.0178, Det: 0.0509, Mat: 0.0357)
+ [Batch 700/1177] Loss: 0.1131 (Sem: 0.0141, Det: 0.0555, Mat: 0.0434)
+ [Batch 750/1177] Loss: 0.1610 (Sem: 0.0243, Det: 0.0816, Mat: 0.0550)
+ [Batch 800/1177] Loss: 0.1243 (Sem: 0.0345, Det: 0.0484, Mat: 0.0415)
+ [Batch 850/1177] Loss: 0.1620 (Sem: 0.0614, Det: 0.0478, Mat: 0.0527)
+ [Batch 900/1177] Loss: 0.1808 (Sem: 0.0270, Det: 0.0865, Mat: 0.0673)
+ [Batch 950/1177] Loss: 0.0988 (Sem: 0.0155, Det: 0.0451, Mat: 0.0382)
+ [Batch 1000/1177] Loss: 0.1515 (Sem: 0.0239, Det: 0.0705, Mat: 0.0571)
+ [Batch 1050/1177] Loss: 0.2205 (Sem: 0.0544, Det: 0.0794, Mat: 0.0867)
+ [Batch 1100/1177] Loss: 0.1176 (Sem: 0.0199, Det: 0.0545, Mat: 0.0432)
+ [Batch 1150/1177] Loss: 0.0722 (Sem: 0.0118, Det: 0.0324, Mat: 0.0281)
+ [Epoch 12 Summary] Avg Loss: 0.1277 (Sem: 0.0271, Det: 0.0544, Mat: 0.0462)
+ [Validation] Running on 500 samples...
+ [Viz] Saved validation preview -> ./output/epoch_12_val.png
+ [Validation] Avg L1 Loss: 0.0087
+ [Checkpoint] Saved -> ./checkpoints/modnet_bn_epoch_12.ckpt
+ [Best Model] New best! Val Loss: 0.0087 -> ./checkpoints/modnet_bn_best.ckpt
+
+ [Epoch 13/15] LR: 0.000100
+ [Batch 0/1177] Loss: 0.1342 (Sem: 0.0255, Det: 0.0604, Mat: 0.0483)
+ [Batch 50/1177] Loss: 0.1127 (Sem: 0.0251, Det: 0.0470, Mat: 0.0405)
+ [Batch 100/1177] Loss: 0.1017 (Sem: 0.0163, Det: 0.0503, Mat: 0.0351)
+ [Batch 150/1177] Loss: 0.0937 (Sem: 0.0170, Det: 0.0416, Mat: 0.0352)
+ [Batch 200/1177] Loss: 0.1376 (Sem: 0.0283, Det: 0.0602, Mat: 0.0490)
+ [Batch 250/1177] Loss: 0.0905 (Sem: 0.0118, Det: 0.0460, Mat: 0.0327)
+ [Batch 300/1177] Loss: 0.1533 (Sem: 0.0186, Det: 0.0758, Mat: 0.0589)
+ [Batch 350/1177] Loss: 0.1329 (Sem: 0.0223, Det: 0.0614, Mat: 0.0492)
+ [Batch 400/1177] Loss: 0.0914 (Sem: 0.0132, Det: 0.0461, Mat: 0.0321)
+ [Batch 450/1177] Loss: 0.0898 (Sem: 0.0141, Det: 0.0421, Mat: 0.0337)
+ [Batch 500/1177] Loss: 0.1923 (Sem: 0.0650, Det: 0.0635, Mat: 0.0638)
+ [Batch 550/1177] Loss: 0.1193 (Sem: 0.0163, Det: 0.0573, Mat: 0.0456)
+ [Batch 600/1177] Loss: 0.0977 (Sem: 0.0135, Det: 0.0479, Mat: 0.0363)
+ [Batch 650/1177] Loss: 0.1914 (Sem: 0.0493, Det: 0.0747, Mat: 0.0673)
+ [Batch 700/1177] Loss: 0.1039 (Sem: 0.0149, Det: 0.0488, Mat: 0.0402)
+ [Batch 750/1177] Loss: 0.1695 (Sem: 0.0220, Det: 0.0864, Mat: 0.0611)
+ [Batch 800/1177] Loss: 0.0964 (Sem: 0.0173, Det: 0.0449, Mat: 0.0342)
+ [Batch 850/1177] Loss: 0.1275 (Sem: 0.0178, Det: 0.0590, Mat: 0.0507)
+ [Batch 900/1177] Loss: 0.1693 (Sem: 0.0710, Det: 0.0475, Mat: 0.0508)
+ [Batch 950/1177] Loss: 0.1285 (Sem: 0.0151, Det: 0.0648, Mat: 0.0486)
+ [Batch 1000/1177] Loss: 0.1603 (Sem: 0.0310, Det: 0.0679, Mat: 0.0615)
+ [Batch 1050/1177] Loss: 0.1809 (Sem: 0.0419, Det: 0.0713, Mat: 0.0677)
+ [Batch 1100/1177] Loss: 0.2207 (Sem: 0.0685, Det: 0.0783, Mat: 0.0739)
+ [Batch 1150/1177] Loss: 0.1554 (Sem: 0.0457, Det: 0.0577, Mat: 0.0520)
+ [Epoch 13 Summary] Avg Loss: 0.1283 (Sem: 0.0277, Det: 0.0542, Mat: 0.0464)
+ [Validation] Running on 500 samples...
+ [Viz] Saved validation preview -> ./output/epoch_13_val.png
+ [Validation] Avg L1 Loss: 0.0088
+ [Checkpoint] Saved -> ./checkpoints/modnet_bn_epoch_13.ckpt
+
+ [Epoch 14/15] LR: 0.000100
+ [Batch 0/1177] Loss: 0.3032 (Sem: 0.1008, Det: 0.0977, Mat: 0.1047)
+ [Batch 50/1177] Loss: 0.0921 (Sem: 0.0116, Det: 0.0480, Mat: 0.0325)
+ [Batch 100/1177] Loss: 0.2305 (Sem: 0.1090, Det: 0.0583, Mat: 0.0632)
+ [Batch 150/1177] Loss: 0.1198 (Sem: 0.0205, Det: 0.0553, Mat: 0.0440)
+ [Batch 200/1177] Loss: 0.1206 (Sem: 0.0292, Det: 0.0469, Mat: 0.0445)
+ [Batch 250/1177] Loss: 0.0952 (Sem: 0.0202, Det: 0.0394, Mat: 0.0356)
+ [Batch 300/1177] Loss: 0.0734 (Sem: 0.0149, Det: 0.0327, Mat: 0.0258)
+ [Batch 350/1177] Loss: 0.1427 (Sem: 0.0216, Det: 0.0675, Mat: 0.0536)
+ [Batch 400/1177] Loss: 0.2862 (Sem: 0.0930, Det: 0.0935, Mat: 0.0997)
+ [Batch 450/1177] Loss: 0.0582 (Sem: 0.0074, Det: 0.0297, Mat: 0.0211)
+ [Batch 500/1177] Loss: 0.0825 (Sem: 0.0112, Det: 0.0410, Mat: 0.0303)
+ [Batch 550/1177] Loss: 0.1416 (Sem: 0.0302, Det: 0.0632, Mat: 0.0482)
+ [Batch 600/1177] Loss: 0.0810 (Sem: 0.0122, Det: 0.0386, Mat: 0.0302)
+ [Batch 650/1177] Loss: 0.1084 (Sem: 0.0108, Det: 0.0554, Mat: 0.0422)
+ [Batch 700/1177] Loss: 0.0789 (Sem: 0.0150, Det: 0.0344, Mat: 0.0296)
+ [Batch 750/1177] Loss: 0.0731 (Sem: 0.0093, Det: 0.0345, Mat: 0.0293)
+ [Batch 800/1177] Loss: 0.1150 (Sem: 0.0147, Det: 0.0557, Mat: 0.0446)
+ [Batch 850/1177] Loss: 0.1733 (Sem: 0.0572, Det: 0.0534, Mat: 0.0626)
+ [Batch 900/1177] Loss: 0.0752 (Sem: 0.0093, Det: 0.0369, Mat: 0.0289)
+ [Batch 950/1177] Loss: 0.1593 (Sem: 0.0377, Det: 0.0645, Mat: 0.0572)
+ [Batch 1000/1177] Loss: 0.1806 (Sem: 0.0333, Det: 0.0798, Mat: 0.0675)
+ [Batch 1050/1177] Loss: 0.0952 (Sem: 0.0158, Det: 0.0449, Mat: 0.0345)
+ [Batch 1100/1177] Loss: 0.2434 (Sem: 0.0330, Det: 0.1123, Mat: 0.0981)
+ [Batch 1150/1177] Loss: 0.1246 (Sem: 0.0209, Det: 0.0568, Mat: 0.0469)
+ [Epoch 14 Summary] Avg Loss: 0.1274 (Sem: 0.0270, Det: 0.0544, Mat: 0.0461)
+ [Validation] Running on 500 samples...
+ [Viz] Saved validation preview -> ./output/epoch_14_val.png
+ [Validation] Avg L1 Loss: 0.0086
+ [Checkpoint] Saved -> ./checkpoints/modnet_bn_epoch_14.ckpt
+ [Best Model] New best! Val Loss: 0.0086 -> ./checkpoints/modnet_bn_best.ckpt
+
+ [Epoch 15/15] LR: 0.000100
+ [Batch 0/1177] Loss: 0.1131 (Sem: 0.0303, Det: 0.0454, Mat: 0.0374)
+ [Batch 50/1177] Loss: 0.0959 (Sem: 0.0107, Det: 0.0468, Mat: 0.0383)
+ [Batch 100/1177] Loss: 0.0927 (Sem: 0.0098, Det: 0.0474, Mat: 0.0355)
+ [Batch 150/1177] Loss: 0.0938 (Sem: 0.0299, Det: 0.0331, Mat: 0.0308)
+ [Batch 200/1177] Loss: 0.1252 (Sem: 0.0185, Det: 0.0617, Mat: 0.0451)
+ [Batch 250/1177] Loss: 0.1098 (Sem: 0.0188, Det: 0.0498, Mat: 0.0412)
+ [Batch 300/1177] Loss: 0.1272 (Sem: 0.0300, Det: 0.0489, Mat: 0.0482)
+ [Batch 350/1177] Loss: 0.1634 (Sem: 0.0341, Det: 0.0678, Mat: 0.0615)
+ [Batch 400/1177] Loss: 0.0812 (Sem: 0.0139, Det: 0.0398, Mat: 0.0276)
+ [Batch 450/1177] Loss: 0.0915 (Sem: 0.0119, Det: 0.0453, Mat: 0.0343)
+ [Batch 500/1177] Loss: 0.2136 (Sem: 0.0703, Det: 0.0688, Mat: 0.0744)
+ [Batch 550/1177] Loss: 0.1080 (Sem: 0.0172, Det: 0.0489, Mat: 0.0419)
+ [Batch 600/1177] Loss: 0.1365 (Sem: 0.0175, Det: 0.0656, Mat: 0.0534)
+ [Batch 650/1177] Loss: 0.0881 (Sem: 0.0103, Det: 0.0454, Mat: 0.0324)
+ [Batch 700/1177] Loss: 0.0756 (Sem: 0.0124, Det: 0.0365, Mat: 0.0267)
+ [Batch 750/1177] Loss: 0.1101 (Sem: 0.0145, Det: 0.0547, Mat: 0.0409)
+ [Batch 800/1177] Loss: 0.1533 (Sem: 0.0255, Det: 0.0711, Mat: 0.0567)
+ [Batch 850/1177] Loss: 0.1160 (Sem: 0.0188, Det: 0.0551, Mat: 0.0421)
+ [Batch 900/1177] Loss: 0.2222 (Sem: 0.0774, Det: 0.0670, Mat: 0.0778)
+ [Batch 950/1177] Loss: 0.1654 (Sem: 0.0374, Det: 0.0678, Mat: 0.0602)
+ [Batch 1000/1177] Loss: 0.2210 (Sem: 0.0828, Det: 0.0656, Mat: 0.0726)
+ [Batch 1050/1177] Loss: 0.2347 (Sem: 0.0402, Det: 0.1081, Mat: 0.0864)
+ [Batch 1100/1177] Loss: 0.1412 (Sem: 0.0288, Det: 0.0592, Mat: 0.0532)
+ [Batch 1150/1177] Loss: 0.1175 (Sem: 0.0195, Det: 0.0537, Mat: 0.0443)
+ [Epoch 15 Summary] Avg Loss: 0.1270 (Sem: 0.0266, Det: 0.0543, Mat: 0.0461)
+ [Validation] Running on 500 samples...
+ [Viz] Saved validation preview -> ./output/epoch_15_val.png
+ [Validation] Avg L1 Loss: 0.0086
+ [Checkpoint] Saved -> ./checkpoints/modnet_bn_epoch_15.ckpt
+ [Best Model] New best! Val Loss: 0.0086 -> ./checkpoints/modnet_bn_best.ckpt
+
+ ============================================================
+ BLOCK 1.2 TRAINING COMPLETE
+ Best Validation L1 Loss: 0.0086
+ Best checkpoint: ./checkpoints/modnet_bn_best.ckpt
+ Visual results: ./output/
+ ============================================================
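The per-batch and per-epoch lines above follow a fixed format, so the log can be reduced to a per-epoch summary table with a short script. A minimal sketch, assuming only the line formats shown in this log; `parse_log` is a hypothetical helper name, and `re.search` is used so diff prefixes like `+ ` do not break matching:

```python
import re

# Matches per-epoch summary lines, e.g.:
#   [Epoch 14 Summary] Avg Loss: 0.1274 (Sem: 0.0270, Det: 0.0544, Mat: 0.0461)
SUMMARY = re.compile(
    r"\[Epoch (\d+) Summary\] Avg Loss: ([\d.]+) "
    r"\(Sem: ([\d.]+), Det: ([\d.]+), Mat: ([\d.]+)\)"
)
# Matches validation lines, e.g.:  [Validation] Avg L1 Loss: 0.0086
VAL = re.compile(r"\[Validation\] Avg L1 Loss: ([\d.]+)")

def parse_log(lines):
    """Return one dict per epoch with train losses and, when present, val L1."""
    epochs, current = [], None
    for line in lines:
        m = SUMMARY.search(line)
        if m:
            current = {
                "epoch": int(m.group(1)),
                "avg_loss": float(m.group(2)),
                "sem": float(m.group(3)),
                "det": float(m.group(4)),
                "mat": float(m.group(5)),
            }
            epochs.append(current)
            continue
        m = VAL.search(line)
        if m and current is not None:
            current["val_l1"] = float(m.group(1))
    return epochs
```

Feeding it the raw log (e.g. `parse_log(open("block1_2_training_20260319_154018.log"))`) would yield rows like `{"epoch": 15, "avg_loss": 0.1270, ..., "val_l1": 0.0086}`, matching the summaries above.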
photographic/finetune/output/epoch_01_val.png ADDED

Git LFS Details

  • SHA256: 609d9e06c1104b4219e1eece7ecd639b2152822ee2f7be492bee907350284ccb
  • Pointer size: 131 Bytes
  • Size of remote file: 470 kB
photographic/finetune/output/epoch_02_val.png ADDED

Git LFS Details

  • SHA256: 3c36ddd0dce856b7ed74c3101f523e6e513b307aa4da4b6732971e7eda3f0d44
  • Pointer size: 131 Bytes
  • Size of remote file: 468 kB
photographic/finetune/output/epoch_03_val.png ADDED

Git LFS Details

  • SHA256: eb6154064c661e246df75239c2365948d31910c5410c166df5f3c342e3d67c09
  • Pointer size: 131 Bytes
  • Size of remote file: 466 kB
photographic/finetune/output/epoch_04_val.png ADDED

Git LFS Details

  • SHA256: 3e0e719d1b975aabc52f23c56f6b788e9e100af2958ce5a06289f7e0121d335d
  • Pointer size: 131 Bytes
  • Size of remote file: 466 kB
photographic/finetune/output/epoch_05_val.png ADDED

Git LFS Details

  • SHA256: 76b77cdf1f85e3bd8ef582b33f6324fad7e4968e1087f342475cbb1d854868ac
  • Pointer size: 131 Bytes
  • Size of remote file: 464 kB
photographic/finetune/output/epoch_06_val.png ADDED

Git LFS Details

  • SHA256: 33c45cc6e3641142f78f07cfc06f6b3be594b28be06ccecb43daf378bcd29624
  • Pointer size: 131 Bytes
  • Size of remote file: 464 kB
photographic/finetune/output/epoch_07_val.png ADDED

Git LFS Details

  • SHA256: 63d64a61ce2f74bc95234390bd50401dce14b169ae6e69632b2a5b8209069024
  • Pointer size: 131 Bytes
  • Size of remote file: 465 kB
photographic/finetune/output/epoch_08_val.png ADDED

Git LFS Details

  • SHA256: bae5508f9de541ea5790c160a320b07326343dfff3e01171091003a4063d42a7
  • Pointer size: 131 Bytes
  • Size of remote file: 464 kB
photographic/finetune/output/epoch_09_val.png ADDED

Git LFS Details

  • SHA256: 69518d92a67338193a710d143da6e04a6c418d43648fa3b943984064b244a7a7
  • Pointer size: 131 Bytes
  • Size of remote file: 464 kB
photographic/finetune/output/epoch_10_val.png ADDED

Git LFS Details

  • SHA256: 956c8497919356af2781bfba33745a7d6aada6f772b9499f161505ee07a5345d
  • Pointer size: 131 Bytes
  • Size of remote file: 465 kB
photographic/finetune/output/epoch_11_val.png ADDED

Git LFS Details

  • SHA256: 1bd73f8d88f2d0890223a72e210c131592a3e8a93b68be00d31e6ce482951dfe
  • Pointer size: 131 Bytes
  • Size of remote file: 465 kB
photographic/finetune/output/epoch_12_val.png ADDED

Git LFS Details

  • SHA256: 62886847c2e7fe9fa6b29d25e1377bb0f920b7dfc4ff297ac79524890cd5afbb
  • Pointer size: 131 Bytes
  • Size of remote file: 464 kB
photographic/finetune/output/epoch_13_val.png ADDED

Git LFS Details

  • SHA256: 5f0a6a957bfc13a277cca8bebdc4bf2ad032883f46d5d3591ecb9d1a763a6787
  • Pointer size: 131 Bytes
  • Size of remote file: 465 kB
photographic/finetune/output/epoch_14_val.png ADDED

Git LFS Details

  • SHA256: d6339ca1d48e5c99d976fe3915bd9d34e67a2b97f3923c87091a022b6e556945
  • Pointer size: 131 Bytes
  • Size of remote file: 465 kB
photographic/finetune/output/epoch_15_val.png ADDED

Git LFS Details

  • SHA256: 6ab3c1de0e35c17f0f44ad8366db60033f7049027c5b7fde4a24df77128bbcf4
  • Pointer size: 131 Bytes
  • Size of remote file: 464 kB