---
license: other
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b1-solarModuleAnomaly-v0.1
  results: []
---
# segformer-b1-solarModuleAnomaly-v0.1

This model is a fine-tuned version of [nvidia/mit-b1](https://huggingface.co/nvidia/mit-b1) on the zklee98/solarModuleAnomaly dataset. It achieves the following results on the evaluation set:
- Loss: 0.1547
- Mean Iou: 0.3822
- Mean Accuracy: 0.7643
- Overall Accuracy: 0.7643
- Accuracy Unlabelled: nan
- Accuracy Anomaly: 0.7643
- Iou Unlabelled: 0.0
- Iou Anomaly: 0.7643
## Model description

More information needed
## Intended uses & limitations

More information needed
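
In the absence of a fuller description, the snippet below is a minimal inference sketch rather than an official usage example. It assumes the checkpoint is published on the Hub under a repo id such as `zklee98/segformer-b1-solarModuleAnomaly-v0.1` (hypothetical) and that the standard `transformers` SegFormer classes apply; label ids should be verified against the checkpoint's `id2label` mapping.

```python
# Minimal inference sketch. Assumptions: hypothetical repo id and a
# two-class head (0 = unlabelled, 1 = anomaly); verify against id2label.
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo_id = "zklee98/segformer-b1-solarModuleAnomaly-v0.1"  # hypothetical
processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("solar_module.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
anomaly_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class ids
```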
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list for how they map onto `TrainingArguments`):
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
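
As a rough guide only, the values above correspond to the following `TrainingArguments`; the output directory and the evaluation cadence are assumptions inferred from the 20-step eval rows below, not taken from the author's script.

```python
# Sketch only: the listed hyperparameters expressed as TrainingArguments.
# The Adam betas/epsilon above are the Trainer defaults, so no extra flag is needed.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="segformer-b1-solarModuleAnomaly-v0.1",  # assumption
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=15,
    lr_scheduler_type="linear",
    evaluation_strategy="steps",  # assumption: matches the 20-step eval rows below
    eval_steps=20,                # assumption
)
# `args` would then be passed to a Trainer together with the nvidia/mit-b1
# model and the preprocessed zklee98/solarModuleAnomaly splits.
```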
### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabelled | Accuracy Anomaly | Iou Unlabelled | Iou Anomaly |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
0.4699 | 0.4 | 20 | 0.6337 | 0.4581 | 0.9162 | 0.9162 | nan | 0.9162 | 0.0 | 0.9162 |
0.3129 | 0.8 | 40 | 0.4636 | 0.3704 | 0.7407 | 0.7407 | nan | 0.7407 | 0.0 | 0.7407 |
0.2732 | 1.2 | 60 | 0.3164 | 0.3867 | 0.7734 | 0.7734 | nan | 0.7734 | 0.0 | 0.7734 |
0.2653 | 1.6 | 80 | 0.3769 | 0.4090 | 0.8180 | 0.8180 | nan | 0.8180 | 0.0 | 0.8180 |
0.2232 | 2.0 | 100 | 0.2976 | 0.2479 | 0.4958 | 0.4958 | nan | 0.4958 | 0.0 | 0.4958 |
0.5305 | 2.4 | 120 | 0.3151 | 0.3807 | 0.7613 | 0.7613 | nan | 0.7613 | 0.0 | 0.7613 |
0.2423 | 2.8 | 140 | 0.3189 | 0.4152 | 0.8305 | 0.8305 | nan | 0.8305 | 0.0 | 0.8305 |
0.3341 | 3.2 | 160 | 0.2384 | 0.3861 | 0.7723 | 0.7723 | nan | 0.7723 | 0.0 | 0.7723 |
0.2146 | 3.6 | 180 | 0.3200 | 0.4621 | 0.9243 | 0.9243 | nan | 0.9243 | 0.0 | 0.9243 |
0.1866 | 4.0 | 200 | 0.2510 | 0.3646 | 0.7291 | 0.7291 | nan | 0.7291 | 0.0 | 0.7291 |
0.2861 | 4.4 | 220 | 0.2736 | 0.4202 | 0.8404 | 0.8404 | nan | 0.8404 | 0.0 | 0.8404 |
0.2048 | 4.8 | 240 | 0.2410 | 0.3912 | 0.7823 | 0.7823 | nan | 0.7823 | 0.0 | 0.7823 |
0.1604 | 5.2 | 260 | 0.2233 | 0.3672 | 0.7344 | 0.7344 | nan | 0.7344 | 0.0 | 0.7344 |
0.2756 | 5.6 | 280 | 0.2705 | 0.4494 | 0.8987 | 0.8987 | nan | 0.8987 | 0.0 | 0.8987 |
0.1859 | 6.0 | 300 | 0.2211 | 0.4045 | 0.8089 | 0.8089 | nan | 0.8089 | 0.0 | 0.8089 |
0.1306 | 6.4 | 320 | 0.2140 | 0.3763 | 0.7525 | 0.7525 | nan | 0.7525 | 0.0 | 0.7525 |
0.5508 | 6.8 | 340 | 0.2231 | 0.4185 | 0.8371 | 0.8371 | nan | 0.8371 | 0.0 | 0.8371 |
0.1446 | 7.2 | 360 | 0.2139 | 0.3666 | 0.7332 | 0.7332 | nan | 0.7332 | 0.0 | 0.7332 |
0.3275 | 7.6 | 380 | 0.2470 | 0.3964 | 0.7928 | 0.7928 | nan | 0.7928 | 0.0 | 0.7928 |
0.164 | 8.0 | 400 | 0.2017 | 0.3910 | 0.7819 | 0.7819 | nan | 0.7819 | 0.0 | 0.7819 |
0.1864 | 8.4 | 420 | 0.2307 | 0.4408 | 0.8816 | 0.8816 | nan | 0.8816 | 0.0 | 0.8816 |
0.1578 | 8.8 | 440 | 0.1869 | 0.3707 | 0.7414 | 0.7414 | nan | 0.7414 | 0.0 | 0.7414 |
0.1201 | 9.2 | 460 | 0.2115 | 0.3834 | 0.7667 | 0.7667 | nan | 0.7667 | 0.0 | 0.7667 |
0.1783 | 9.6 | 480 | 0.2009 | 0.3747 | 0.7495 | 0.7495 | nan | 0.7495 | 0.0 | 0.7495 |
0.1232 | 10.0 | 500 | 0.1797 | 0.3865 | 0.7729 | 0.7729 | nan | 0.7729 | 0.0 | 0.7729 |
0.2572 | 10.4 | 520 | 0.1983 | 0.4057 | 0.8115 | 0.8115 | nan | 0.8115 | 0.0 | 0.8115 |
0.1209 | 10.8 | 540 | 0.1607 | 0.4274 | 0.8547 | 0.8547 | nan | 0.8547 | 0.0 | 0.8547 |
0.1234 | 11.2 | 560 | 0.2260 | 0.4066 | 0.8133 | 0.8133 | nan | 0.8133 | 0.0 | 0.8133 |
0.145 | 11.6 | 580 | 0.1963 | 0.3939 | 0.7878 | 0.7878 | nan | 0.7878 | 0.0 | 0.7878 |
0.0665 | 12.0 | 600 | 0.1912 | 0.3873 | 0.7747 | 0.7747 | nan | 0.7747 | 0.0 | 0.7747 |
0.0826 | 12.4 | 620 | 0.2095 | 0.4186 | 0.8373 | 0.8373 | nan | 0.8373 | 0.0 | 0.8373 |
0.1212 | 12.8 | 640 | 0.1732 | 0.4059 | 0.8118 | 0.8118 | nan | 0.8118 | 0.0 | 0.8118 |
0.142 | 13.2 | 660 | 0.2086 | 0.4007 | 0.8013 | 0.8013 | nan | 0.8013 | 0.0 | 0.8013 |
0.0899 | 13.6 | 680 | 0.1838 | 0.3928 | 0.7856 | 0.7856 | nan | 0.7856 | 0.0 | 0.7856 |
0.1857 | 14.0 | 700 | 0.1638 | 0.4157 | 0.8315 | 0.8315 | nan | 0.8315 | 0.0 | 0.8315 |
0.0788 | 14.4 | 720 | 0.1736 | 0.4112 | 0.8223 | 0.8223 | nan | 0.8223 | 0.0 | 0.8223 |
0.2543 | 14.8 | 740 | 0.1547 | 0.3822 | 0.7643 | 0.7643 | nan | 0.7643 | 0.0 | 0.7643 |
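
The `nan` unlabelled accuracy and 0.0 unlabelled IoU throughout are consistent with a class that never appears in the ground truth: its per-class accuracy is 0/0, while any pixels wrongly predicted as it drive its IoU to 0, which in turn roughly halves the mean IoU relative to the anomaly IoU. The NumPy sketch below illustrates that arithmetic; it is not the evaluation code used for this card.

```python
# Per-class IoU/accuracy arithmetic (illustrative only, NumPy).
# Class 0 = unlabelled, class 1 = anomaly.
import numpy as np

def per_class_metrics(pred, gt, num_labels=2):
    ious, accs = [], []
    for c in range(num_labels):
        tp = np.sum((pred == c) & (gt == c))
        fp = np.sum((pred == c) & (gt != c))
        fn = np.sum((pred != c) & (gt == c))
        ious.append(tp / (tp + fp + fn) if (tp + fp + fn) else np.nan)
        accs.append(tp / (tp + fn) if (tp + fn) else np.nan)  # per-class recall
    return np.array(ious), np.array(accs)

# Toy masks: no unlabelled pixels in the ground truth, but the prediction
# marks two pixels as unlabelled by mistake.
gt = np.ones((4, 4), dtype=int)
pred = np.ones((4, 4), dtype=int)
pred[0, :2] = 0

ious, accs = per_class_metrics(pred, gt)
print(accs)              # [nan, 0.875]  -> "Accuracy Unlabelled: nan"
print(ious)              # [0.0, 0.875]  -> "Iou Unlabelled: 0.0"
print(ious.mean())       # 0.4375: the 0.0 class halves the mean IoU
print(np.nanmean(accs))  # 0.875: the nan class is dropped from mean accuracy
```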
### Framework versions

- Transformers 4.28.0
- Pytorch 2.0.0+cu118
- Datasets 2.11.0
- Tokenizers 0.13.3