# segformer-finetuned-4ss1st3r_s3gs3m_24Jan_negro-10k-steps
This model is a fine-tuned version of nvidia/mit-b0 on the blzncz/4ss1st3r_s3gs3m_24Jan_negro dataset. It achieves the following results on the evaluation set:
- Loss: 0.2142
- Mean Iou: 0.5724
- Mean Accuracy: 0.7571
- Overall Accuracy: 0.9468
- Accuracy Bg: nan
- Accuracy Fallo cohesivo: 0.9826
- Accuracy Fallo malla: 0.7246
- Accuracy Fallo adhesivo: 0.9679
- Accuracy Fallo burbuja: 0.3533
- Iou Bg: 0.0
- Iou Fallo cohesivo: 0.9368
- Iou Fallo malla: 0.6678
- Iou Fallo adhesivo: 0.9310
- Iou Fallo burbuja: 0.3263
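The headline numbers follow directly from the per-class values above: Mean Iou averages IoU over all five labels (including the background class, whose IoU is 0.0 here), while Mean Accuracy skips classes reported as nan. A minimal sketch with NumPy, using the per-class values from this evaluation:

```python
import numpy as np

# Per-class values from the evaluation results above
# (order: Bg, Fallo cohesivo, Fallo malla, Fallo adhesivo, Fallo burbuja)
iou = np.array([0.0, 0.9368, 0.6678, 0.9310, 0.3263])
acc = np.array([np.nan, 0.9826, 0.7246, 0.9679, 0.3533])

mean_iou = iou.mean()       # background's 0.0 IoU is included and drags this down
mean_acc = np.nanmean(acc)  # nan entries (Bg) are excluded from the mean

print(round(float(mean_iou), 4))  # 0.5724
print(round(float(mean_acc), 4))  # 0.7571
```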
## Model description
More information needed
## Intended uses & limitations
More information needed
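One practical detail for anyone using this model: SegFormer predicts logits at 1/4 of the input resolution, so they must be upsampled back to the image size before taking the per-pixel argmax. A hypothetical post-processing sketch in PyTorch, using dummy logits in place of the real `model(pixel_values).logits` (5 channels = Bg plus the four Fallo classes):

```python
import torch
import torch.nn.functional as F

num_labels = 5  # Bg + 4 failure classes in this dataset
# Dummy stand-in for model(pixel_values).logits, which SegFormer
# emits at 1/4 of the input height and width
logits = torch.randn(1, num_labels, 128, 128)
target_size = (512, 512)  # original image (height, width)

# Upsample logits to the input resolution, then take the per-pixel argmax
upsampled = F.interpolate(logits, size=target_size,
                          mode="bilinear", align_corners=False)
pred_map = upsampled.argmax(dim=1)  # shape (1, 512, 512), values in 0..4

print(pred_map.shape)
```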
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: polynomial
- training_steps: 10000
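With `lr_scheduler_type: polynomial`, the learning rate decays from its initial value toward an end value over the training steps. A sketch of the decay formula under assumed defaults (`power=1.0` and `lr_end=1e-7` are the `transformers` defaults; this card does not state them):

```python
def polynomial_lr(step, total_steps=10_000, lr_init=6e-5,
                  lr_end=1e-7, power=1.0):
    """Polynomial-decay schedule; with power=1.0 this is a
    linear ramp from lr_init down to lr_end."""
    if step >= total_steps:
        return lr_end
    decay = (1 - step / total_steps) ** power
    return (lr_init - lr_end) * decay + lr_end

print(polynomial_lr(0))       # starts at the configured learning_rate
print(polynomial_lr(10_000))  # ends at lr_end
```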
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Bg | Accuracy Fallo cohesivo | Accuracy Fallo malla | Accuracy Fallo adhesivo | Accuracy Fallo burbuja | Iou Bg | Iou Fallo cohesivo | Iou Fallo malla | Iou Fallo adhesivo | Iou Fallo burbuja |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0.1902 | 1.0 | 219 | 0.2615 | 0.5072 | 0.7247 | 0.9038 | nan | 0.9203 | 0.7991 | 0.9401 | 0.2393 | 0.0 | 0.8853 | 0.5556 | 0.9006 | 0.1944 |
0.1367 | 2.0 | 438 | 0.2067 | 0.5492 | 0.7602 | 0.9293 | nan | 0.9599 | 0.7487 | 0.9317 | 0.4004 | 0.0 | 0.9160 | 0.6120 | 0.8988 | 0.3195 |
0.1066 | 3.0 | 657 | 0.1963 | 0.5659 | 0.7814 | 0.9313 | nan | 0.9520 | 0.8022 | 0.9545 | 0.4169 | 0.0 | 0.9175 | 0.6270 | 0.9267 | 0.3584 |
0.1102 | 4.0 | 876 | 0.1595 | 0.5782 | 0.7756 | 0.9444 | nan | 0.9727 | 0.7669 | 0.9693 | 0.3934 | 0.0 | 0.9336 | 0.6828 | 0.9326 | 0.3422 |
0.1114 | 5.0 | 1095 | 0.1678 | 0.5772 | 0.7950 | 0.9378 | nan | 0.9619 | 0.7778 | 0.9756 | 0.4648 | 0.0 | 0.9255 | 0.6534 | 0.9151 | 0.3922 |
0.0897 | 6.0 | 1314 | 0.1726 | 0.5811 | 0.7976 | 0.9420 | nan | 0.9701 | 0.7598 | 0.9723 | 0.4881 | 0.0 | 0.9307 | 0.6613 | 0.9170 | 0.3965 |
0.0788 | 7.0 | 1533 | 0.2096 | 0.5491 | 0.7253 | 0.9342 | nan | 0.9898 | 0.5936 | 0.9381 | 0.3797 | 0.0 | 0.9235 | 0.5698 | 0.9149 | 0.3374 |
0.0788 | 8.0 | 1752 | 0.1574 | 0.5774 | 0.7733 | 0.9465 | nan | 0.9726 | 0.7858 | 0.9675 | 0.3673 | 0.0 | 0.9359 | 0.6914 | 0.9264 | 0.3331 |
0.0855 | 9.0 | 1971 | 0.1970 | 0.5406 | 0.7141 | 0.9380 | nan | 0.9866 | 0.6305 | 0.9708 | 0.2687 | 0.0 | 0.9274 | 0.5984 | 0.9224 | 0.2548 |
0.0761 | 10.0 | 2190 | 0.1903 | 0.5564 | 0.7479 | 0.9382 | nan | 0.9746 | 0.7050 | 0.9737 | 0.3383 | 0.0 | 0.9268 | 0.6272 | 0.9182 | 0.3098 |
0.0686 | 11.0 | 2409 | 0.1910 | 0.5562 | 0.7435 | 0.9393 | nan | 0.9827 | 0.6605 | 0.9738 | 0.3572 | 0.0 | 0.9285 | 0.6156 | 0.9209 | 0.3160 |
0.062 | 12.0 | 2628 | 0.2038 | 0.5453 | 0.7399 | 0.9334 | nan | 0.9728 | 0.6739 | 0.9811 | 0.3317 | 0.0 | 0.9214 | 0.6013 | 0.9035 | 0.3001 |
0.0586 | 13.0 | 2847 | 0.1914 | 0.5471 | 0.7342 | 0.9402 | nan | 0.9758 | 0.7103 | 0.9814 | 0.2693 | 0.0 | 0.9290 | 0.6397 | 0.9150 | 0.2517 |
0.0531 | 14.0 | 3066 | 0.1747 | 0.5716 | 0.7689 | 0.9449 | nan | 0.9701 | 0.7945 | 0.9588 | 0.3522 | 0.0 | 0.9339 | 0.6815 | 0.9280 | 0.3147 |
0.0522 | 15.0 | 3285 | 0.1933 | 0.5591 | 0.7399 | 0.9454 | nan | 0.9810 | 0.7222 | 0.9744 | 0.2820 | 0.0 | 0.9351 | 0.6603 | 0.9355 | 0.2645 |
0.059 | 16.0 | 3504 | 0.1897 | 0.5691 | 0.7878 | 0.9384 | nan | 0.9499 | 0.8594 | 0.9809 | 0.3608 | 0.0 | 0.9252 | 0.6741 | 0.9159 | 0.3303 |
0.0503 | 17.0 | 3723 | 0.1895 | 0.5652 | 0.7795 | 0.9365 | nan | 0.9588 | 0.7866 | 0.9808 | 0.3917 | 0.0 | 0.9238 | 0.6508 | 0.9004 | 0.3511 |
0.0518 | 18.0 | 3942 | 0.2131 | 0.5533 | 0.7332 | 0.9402 | nan | 0.9807 | 0.6877 | 0.9645 | 0.2998 | 0.0 | 0.9294 | 0.6248 | 0.9334 | 0.2790 |
0.0439 | 19.0 | 4161 | 0.2168 | 0.5565 | 0.7411 | 0.9388 | nan | 0.9801 | 0.6828 | 0.9567 | 0.3448 | 0.0 | 0.9278 | 0.6194 | 0.9234 | 0.3121 |
0.0459 | 20.0 | 4380 | 0.2688 | 0.5266 | 0.7127 | 0.9266 | nan | 0.9824 | 0.5567 | 0.9841 | 0.3277 | 0.0 | 0.9149 | 0.5329 | 0.8866 | 0.2987 |
0.043 | 21.0 | 4599 | 0.2395 | 0.5542 | 0.7409 | 0.9369 | nan | 0.9821 | 0.6444 | 0.9745 | 0.3625 | 0.0 | 0.9258 | 0.5974 | 0.9228 | 0.3248 |
0.0436 | 22.0 | 4818 | 0.1790 | 0.5736 | 0.7750 | 0.9441 | nan | 0.9706 | 0.7783 | 0.9694 | 0.3819 | 0.0 | 0.9331 | 0.6772 | 0.9143 | 0.3433 |
0.0443 | 23.0 | 5037 | 0.1843 | 0.5683 | 0.7613 | 0.9442 | nan | 0.9756 | 0.7470 | 0.9716 | 0.3511 | 0.0 | 0.9335 | 0.6684 | 0.9177 | 0.3219 |
0.0402 | 24.0 | 5256 | 0.2048 | 0.5666 | 0.7535 | 0.9429 | nan | 0.9800 | 0.7089 | 0.9706 | 0.3544 | 0.0 | 0.9324 | 0.6457 | 0.9302 | 0.3246 |
0.0399 | 25.0 | 5475 | 0.2102 | 0.5651 | 0.7524 | 0.9430 | nan | 0.9830 | 0.6875 | 0.9754 | 0.3637 | 0.0 | 0.9327 | 0.6412 | 0.9231 | 0.3287 |
0.0404 | 26.0 | 5694 | 0.1993 | 0.5792 | 0.7815 | 0.9460 | nan | 0.9690 | 0.8035 | 0.9697 | 0.3837 | 0.0 | 0.9351 | 0.6876 | 0.9289 | 0.3443 |
0.0388 | 27.0 | 5913 | 0.2024 | 0.5681 | 0.7501 | 0.9470 | nan | 0.9821 | 0.7343 | 0.9605 | 0.3236 | 0.0 | 0.9370 | 0.6715 | 0.9322 | 0.3001 |
0.0369 | 28.0 | 6132 | 0.1830 | 0.5701 | 0.7553 | 0.9481 | nan | 0.9779 | 0.7698 | 0.9608 | 0.3126 | 0.0 | 0.9379 | 0.6871 | 0.9323 | 0.2931 |
0.0373 | 29.0 | 6351 | 0.2162 | 0.5682 | 0.7535 | 0.9438 | nan | 0.9828 | 0.7011 | 0.9639 | 0.3665 | 0.0 | 0.9335 | 0.6482 | 0.9239 | 0.3352 |
0.0348 | 30.0 | 6570 | 0.2126 | 0.5640 | 0.7479 | 0.9435 | nan | 0.9813 | 0.7097 | 0.9623 | 0.3384 | 0.0 | 0.9330 | 0.6537 | 0.9197 | 0.3135 |
0.0354 | 31.0 | 6789 | 0.2025 | 0.5626 | 0.7467 | 0.9469 | nan | 0.9795 | 0.7453 | 0.9725 | 0.2896 | 0.0 | 0.9368 | 0.6762 | 0.9285 | 0.2716 |
0.0344 | 32.0 | 7008 | 0.1973 | 0.5786 | 0.7739 | 0.9469 | nan | 0.9734 | 0.7828 | 0.9698 | 0.3695 | 0.0 | 0.9364 | 0.6853 | 0.9326 | 0.3389 |
0.0333 | 33.0 | 7227 | 0.2199 | 0.5722 | 0.7624 | 0.9438 | nan | 0.9817 | 0.7045 | 0.9696 | 0.3940 | 0.0 | 0.9334 | 0.6481 | 0.9287 | 0.3510 |
0.0345 | 34.0 | 7446 | 0.2052 | 0.5791 | 0.7724 | 0.9465 | nan | 0.9799 | 0.7347 | 0.9736 | 0.4015 | 0.0 | 0.9363 | 0.6698 | 0.9311 | 0.3582 |
0.0326 | 35.0 | 7665 | 0.2176 | 0.5758 | 0.7629 | 0.9462 | nan | 0.9835 | 0.7124 | 0.9689 | 0.3868 | 0.0 | 0.9362 | 0.6595 | 0.9345 | 0.3490 |
0.034 | 36.0 | 7884 | 0.2247 | 0.5717 | 0.7557 | 0.9453 | nan | 0.9841 | 0.7033 | 0.9661 | 0.3694 | 0.0 | 0.9352 | 0.6533 | 0.9331 | 0.3369 |
0.0324 | 37.0 | 8103 | 0.1957 | 0.5797 | 0.7736 | 0.9490 | nan | 0.9763 | 0.7801 | 0.9725 | 0.3657 | 0.0 | 0.9390 | 0.6963 | 0.9299 | 0.3333 |
0.0332 | 38.0 | 8322 | 0.1996 | 0.5770 | 0.7644 | 0.9478 | nan | 0.9826 | 0.7310 | 0.9696 | 0.3743 | 0.0 | 0.9379 | 0.6741 | 0.9336 | 0.3393 |
0.0332 | 39.0 | 8541 | 0.2129 | 0.5638 | 0.7423 | 0.9449 | nan | 0.9845 | 0.7021 | 0.9616 | 0.3212 | 0.0 | 0.9348 | 0.6514 | 0.9328 | 0.3001 |
0.03 | 40.0 | 8760 | 0.2283 | 0.5694 | 0.7539 | 0.9441 | nan | 0.9840 | 0.6931 | 0.9686 | 0.3699 | 0.0 | 0.9339 | 0.6464 | 0.9277 | 0.3387 |
0.0319 | 41.0 | 8979 | 0.2013 | 0.5741 | 0.7624 | 0.9471 | nan | 0.9804 | 0.7416 | 0.9670 | 0.3606 | 0.0 | 0.9370 | 0.6760 | 0.9277 | 0.3300 |
0.0361 | 42.0 | 9198 | 0.2094 | 0.5709 | 0.7568 | 0.9463 | nan | 0.9810 | 0.7317 | 0.9663 | 0.3483 | 0.0 | 0.9362 | 0.6689 | 0.9279 | 0.3216 |
0.0304 | 43.0 | 9417 | 0.2098 | 0.5731 | 0.7586 | 0.9468 | nan | 0.9821 | 0.7282 | 0.9666 | 0.3575 | 0.0 | 0.9368 | 0.6700 | 0.9295 | 0.3293 |
0.0303 | 44.0 | 9636 | 0.2155 | 0.5705 | 0.7554 | 0.9470 | nan | 0.9814 | 0.7329 | 0.9702 | 0.3370 | 0.0 | 0.9369 | 0.6718 | 0.9301 | 0.3137 |
0.03 | 45.0 | 9855 | 0.2183 | 0.5703 | 0.7541 | 0.9464 | nan | 0.9825 | 0.7229 | 0.9677 | 0.3435 | 0.0 | 0.9364 | 0.6657 | 0.9311 | 0.3181 |
0.0301 | 45.66 | 10000 | 0.2142 | 0.5724 | 0.7571 | 0.9468 | nan | 0.9826 | 0.7246 | 0.9679 | 0.3533 | 0.0 | 0.9368 | 0.6678 | 0.9310 | 0.3263 |
### Framework versions
- Transformers 4.31.0.dev0
- Pytorch 2.0.1+cpu
- Datasets 2.13.1
- Tokenizers 0.13.3