SatwikKambham committed
Commit ae38dc2
1 Parent(s): 7cafd2a

End of training

Files changed (4)
  1. README.md +172 -0
  2. config.json +90 -0
  3. pytorch_model.bin +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,172 @@
+ ---
+ license: other
+ base_model: nvidia/mit-b0
+ tags:
+ - vision
+ - image-segmentation
+ - generated_from_trainer
+ datasets:
+ - SatwikKambham/suim
+ model-index:
+ - name: segformer-b0-finetuned-suim
+ results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # segformer-b0-finetuned-suim
+
+ This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the SatwikKambham/suim dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.5044
+ - Mean Iou: 0.6138
+ - Mean Accuracy: 0.7713
+ - Overall Accuracy: 0.8213
+ - Accuracy Background (waterbody): nan
+ - Accuracy Human divers: 0.9139
+ - Accuracy Aquatic plants and sea-grass: 0.2842
+ - Accuracy Wrecks and ruins: 0.8156
+ - Accuracy Robots (auvs/rovs/instruments): 0.8117
+ - Accuracy Reefs and invertebrates: 0.9098
+ - Accuracy Fish and vertebrates: 0.8540
+ - Accuracy Sea-floor and rocks: 0.8096
+ - Iou Background (waterbody): 0.0
+ - Iou Human divers: 0.8428
+ - Iou Aquatic plants and sea-grass: 0.2638
+ - Iou Wrecks and ruins: 0.7560
+ - Iou Robots (auvs/rovs/instruments): 0.7896
+ - Iou Reefs and invertebrates: 0.7482
+ - Iou Fish and vertebrates: 0.7927
+ - Iou Sea-floor and rocks: 0.7176
+
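+ As a minimal inference sketch (assuming the checkpoint is published under the repo id `SatwikKambham/segformer-b0-finetuned-suim`, which is inferred from the model name above and may need to be adjusted), the model can be loaded with the `transformers` API:
+
+ ```python
+ import torch
+ from PIL import Image
+ from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation
+
+ # Assumed repo id, inferred from the model name above.
+ repo_id = "SatwikKambham/segformer-b0-finetuned-suim"
+
+ processor = SegformerImageProcessor.from_pretrained(repo_id)
+ model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
+
+ image = Image.open("underwater_scene.jpg")  # any RGB underwater image
+ inputs = processor(images=image, return_tensors="pt")
+
+ with torch.no_grad():
+     # logits come out at 1/4 of the processed resolution, shape (1, num_labels, h, w)
+     logits = model(**inputs).logits
+
+ # Upsample to the original image size and take the per-pixel argmax.
+ upsampled = torch.nn.functional.interpolate(
+     logits, size=image.size[::-1], mode="bilinear", align_corners=False
+ )
+ pred = upsampled.argmax(dim=1)[0]  # (H, W) map of class ids (see id2label in config.json)
+ ```
+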
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 5e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 50
+
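+ Expressed as a `TrainingArguments` sketch (only the values listed above come from the run; `output_dir`, the eval cadence, and `push_to_hub` are illustrative assumptions):
+
+ ```python
+ from transformers import TrainingArguments
+
+ training_args = TrainingArguments(
+     output_dir="segformer-b0-finetuned-suim",  # assumed
+     learning_rate=5e-5,
+     per_device_train_batch_size=8,
+     per_device_eval_batch_size=8,
+     seed=42,
+     num_train_epochs=50,
+     lr_scheduler_type="linear",
+     adam_beta1=0.9,
+     adam_beta2=0.999,
+     adam_epsilon=1e-8,
+     evaluation_strategy="steps",  # the table below reports eval metrics every 100 steps
+     eval_steps=100,
+     push_to_hub=True,  # assumed
+ )
+ ```
+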
68
+ ### Training results
69
+
70
+ | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background (waterbody) | Accuracy Human divers | Accuracy Aquatic plants and sea-grass | Accuracy Wrecks and ruins | Accuracy Robots (auvs/rovs/instruments) | Accuracy Reefs and invertebrates | Accuracy Fish and vertebrates | Accuracy Sea-floor and rocks | Iou Background (waterbody) | Iou Human divers | Iou Aquatic plants and sea-grass | Iou Wrecks and ruins | Iou Robots (auvs/rovs/instruments) | Iou Reefs and invertebrates | Iou Fish and vertebrates | Iou Sea-floor and rocks |
71
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------------------:|:---------------------:|:-------------------------------------:|:-------------------------:|:---------------------------------------:|:--------------------------------:|:-----------------------------:|:----------------------------:|:--------------------------:|:----------------:|:--------------------------------:|:--------------------:|:----------------------------------:|:---------------------------:|:------------------------:|:-----------------------:|
72
+ | 1.1911 | 0.54 | 100 | 0.9270 | 0.3052 | 0.4434 | 0.6401 | nan | 0.2785 | 0.0 | 0.6857 | 0.0 | 0.7643 | 0.7044 | 0.6709 | 0.0 | 0.2555 | 0.0 | 0.4605 | 0.0 | 0.6111 | 0.5394 | 0.5753 |
73
+ | 0.6852 | 1.08 | 200 | 0.7299 | 0.3457 | 0.4967 | 0.7065 | nan | 0.3429 | 0.0 | 0.8033 | 0.0 | 0.7829 | 0.7653 | 0.7826 | 0.0 | 0.3126 | 0.0 | 0.5458 | 0.0 | 0.6336 | 0.5961 | 0.6776 |
74
+ | 0.854 | 1.61 | 300 | 0.7453 | 0.3420 | 0.4937 | 0.6506 | nan | 0.5488 | 0.0 | 0.7778 | 0.0 | 0.6528 | 0.7659 | 0.7103 | 0.0 | 0.4867 | 0.0 | 0.5156 | 0.0 | 0.5619 | 0.6180 | 0.5536 |
75
+ | 0.5111 | 2.15 | 400 | 0.6174 | 0.3920 | 0.5455 | 0.7207 | nan | 0.7475 | 0.0 | 0.6988 | 0.0 | 0.8364 | 0.7907 | 0.7449 | 0.0 | 0.5704 | 0.0 | 0.5986 | 0.0 | 0.6422 | 0.6796 | 0.6451 |
76
+ | 0.5311 | 2.69 | 500 | 0.5811 | 0.4084 | 0.5606 | 0.7368 | nan | 0.7697 | 0.0 | 0.7729 | 0.0 | 0.8658 | 0.7824 | 0.7333 | 0.0 | 0.6522 | 0.0 | 0.6000 | 0.0 | 0.6551 | 0.7045 | 0.6555 |
77
+ | 0.5595 | 3.23 | 600 | 0.5591 | 0.4165 | 0.5686 | 0.7420 | nan | 0.8291 | 0.0 | 0.7557 | 0.0 | 0.8848 | 0.7805 | 0.7299 | 0.0 | 0.6524 | 0.0 | 0.6536 | 0.0 | 0.6760 | 0.7011 | 0.6489 |
78
+ | 0.6847 | 3.76 | 700 | 0.5657 | 0.4068 | 0.5740 | 0.7336 | nan | 0.8233 | 0.0 | 0.8490 | 0.0 | 0.8649 | 0.8157 | 0.6648 | 0.0 | 0.6575 | 0.0 | 0.5982 | 0.0 | 0.6532 | 0.7179 | 0.6277 |
79
+ | 0.3272 | 4.3 | 800 | 0.5065 | 0.4309 | 0.5838 | 0.7573 | nan | 0.7906 | 0.0 | 0.8117 | 0.0467 | 0.8729 | 0.8021 | 0.7629 | 0.0 | 0.6564 | 0.0 | 0.6608 | 0.0467 | 0.6666 | 0.7242 | 0.6925 |
80
+ | 0.5065 | 4.84 | 900 | 0.4745 | 0.4420 | 0.5978 | 0.7675 | nan | 0.7969 | 0.0000 | 0.8235 | 0.0879 | 0.8301 | 0.8187 | 0.8273 | 0.0 | 0.6668 | 0.0000 | 0.6834 | 0.0879 | 0.6847 | 0.7247 | 0.6882 |
81
+ | 0.3712 | 5.38 | 1000 | 0.4567 | 0.4691 | 0.6296 | 0.7824 | nan | 0.8468 | 0.0004 | 0.7933 | 0.2299 | 0.9005 | 0.8431 | 0.7929 | 0.0 | 0.6829 | 0.0004 | 0.7036 | 0.2298 | 0.6852 | 0.7381 | 0.7125 |
82
+ | 0.6866 | 5.91 | 1100 | 0.4453 | 0.5352 | 0.6932 | 0.7843 | nan | 0.8487 | 0.0236 | 0.8462 | 0.6450 | 0.8373 | 0.8198 | 0.8317 | 0.0 | 0.7810 | 0.0227 | 0.7122 | 0.6437 | 0.6768 | 0.7454 | 0.7001 |
83
+ | 0.4374 | 6.45 | 1200 | 0.4806 | 0.5279 | 0.6836 | 0.7705 | nan | 0.8392 | 0.1452 | 0.7508 | 0.5700 | 0.8615 | 0.8452 | 0.7732 | 0.0 | 0.7349 | 0.1403 | 0.6883 | 0.5638 | 0.6990 | 0.7439 | 0.6532 |
84
+ | 0.5409 | 6.99 | 1300 | 0.4671 | 0.5403 | 0.6987 | 0.7768 | nan | 0.8636 | 0.0181 | 0.7956 | 0.7149 | 0.8739 | 0.8502 | 0.7746 | 0.0 | 0.7666 | 0.0179 | 0.7256 | 0.7072 | 0.6748 | 0.7473 | 0.6834 |
85
+ | 0.3526 | 7.53 | 1400 | 0.4691 | 0.5517 | 0.7168 | 0.7811 | nan | 0.8670 | 0.2005 | 0.8345 | 0.6704 | 0.8837 | 0.8260 | 0.7353 | 0.0 | 0.7791 | 0.1891 | 0.6723 | 0.6687 | 0.6845 | 0.7447 | 0.6752 |
86
+ | 0.2883 | 8.06 | 1500 | 0.4418 | 0.5744 | 0.7341 | 0.8013 | nan | 0.8526 | 0.2870 | 0.7820 | 0.6996 | 0.8971 | 0.8211 | 0.7993 | 0.0 | 0.7654 | 0.2764 | 0.7106 | 0.6901 | 0.6891 | 0.7557 | 0.7081 |
87
+ | 0.645 | 8.6 | 1600 | 0.4597 | 0.5730 | 0.7375 | 0.7945 | nan | 0.8874 | 0.1758 | 0.8621 | 0.7514 | 0.8888 | 0.8442 | 0.7530 | 0.0 | 0.8018 | 0.1683 | 0.7251 | 0.7413 | 0.7048 | 0.7556 | 0.6869 |
88
+ | 0.2953 | 9.14 | 1700 | 0.4279 | 0.5967 | 0.7596 | 0.8068 | nan | 0.8210 | 0.4181 | 0.8465 | 0.7328 | 0.8293 | 0.8410 | 0.8285 | 0.0 | 0.7607 | 0.3735 | 0.7601 | 0.7259 | 0.7098 | 0.7495 | 0.6944 |
89
+ | 0.3441 | 9.68 | 1800 | 0.4727 | 0.5662 | 0.7387 | 0.7701 | nan | 0.8660 | 0.2776 | 0.8603 | 0.7675 | 0.6688 | 0.8444 | 0.8867 | 0.0 | 0.7965 | 0.2577 | 0.7043 | 0.7602 | 0.6180 | 0.7374 | 0.6556 |
90
+ | 0.4354 | 10.22 | 1900 | 0.4499 | 0.5868 | 0.7466 | 0.8000 | nan | 0.7931 | 0.3488 | 0.8160 | 0.7521 | 0.8689 | 0.8599 | 0.7876 | 0.0 | 0.7399 | 0.3263 | 0.7469 | 0.7342 | 0.7090 | 0.7542 | 0.6836 |
91
+ | 0.3476 | 10.75 | 2000 | 0.4732 | 0.5468 | 0.7053 | 0.7802 | nan | 0.8976 | 0.0416 | 0.8285 | 0.6882 | 0.8939 | 0.8447 | 0.7426 | 0.0 | 0.8055 | 0.0402 | 0.7180 | 0.6837 | 0.6855 | 0.7632 | 0.6781 |
92
+ | 0.3416 | 11.29 | 2100 | 0.4553 | 0.5745 | 0.7342 | 0.7949 | nan | 0.8803 | 0.1754 | 0.7792 | 0.7736 | 0.9134 | 0.8551 | 0.7625 | 0.0 | 0.8009 | 0.1632 | 0.7275 | 0.7484 | 0.6939 | 0.7627 | 0.6996 |
93
+ | 0.157 | 11.83 | 2200 | 0.4684 | 0.5814 | 0.7432 | 0.7927 | nan | 0.8915 | 0.2169 | 0.8390 | 0.7693 | 0.8894 | 0.8532 | 0.7434 | 0.0 | 0.8147 | 0.2091 | 0.7275 | 0.7518 | 0.7105 | 0.7707 | 0.6666 |
94
+ | 0.1665 | 12.37 | 2300 | 0.4369 | 0.6031 | 0.7710 | 0.8076 | nan | 0.9014 | 0.3852 | 0.8488 | 0.7526 | 0.8590 | 0.8721 | 0.7780 | 0.0 | 0.8264 | 0.3500 | 0.7249 | 0.7403 | 0.7166 | 0.7678 | 0.6990 |
95
+ | 0.3426 | 12.9 | 2400 | 0.4458 | 0.5814 | 0.7495 | 0.7925 | nan | 0.8842 | 0.2608 | 0.8393 | 0.7845 | 0.8458 | 0.8491 | 0.7826 | 0.0 | 0.8201 | 0.2433 | 0.6979 | 0.7533 | 0.6987 | 0.7597 | 0.6779 |
96
+ | 0.2268 | 13.44 | 2500 | 0.4612 | 0.5776 | 0.7355 | 0.7929 | nan | 0.8722 | 0.1872 | 0.7989 | 0.7789 | 0.8675 | 0.8461 | 0.7978 | 0.0 | 0.7964 | 0.1755 | 0.7478 | 0.7551 | 0.7139 | 0.7654 | 0.6665 |
97
+ | 0.3108 | 13.98 | 2600 | 0.4449 | 0.5926 | 0.7490 | 0.8112 | nan | 0.8756 | 0.2111 | 0.8021 | 0.7897 | 0.8921 | 0.8473 | 0.8256 | 0.0 | 0.8106 | 0.2007 | 0.7520 | 0.7736 | 0.7287 | 0.7737 | 0.7017 |
98
+ | 0.1832 | 14.52 | 2700 | 0.4271 | 0.6254 | 0.7965 | 0.8277 | nan | 0.8930 | 0.5784 | 0.7930 | 0.7627 | 0.8712 | 0.8446 | 0.8323 | 0.0 | 0.8085 | 0.4667 | 0.7385 | 0.7489 | 0.7402 | 0.7689 | 0.7312 |
99
+ | 0.1784 | 15.05 | 2800 | 0.4531 | 0.5858 | 0.7448 | 0.7987 | nan | 0.9073 | 0.2227 | 0.7956 | 0.7764 | 0.8722 | 0.8323 | 0.8069 | 0.0 | 0.8232 | 0.2077 | 0.7337 | 0.7556 | 0.7050 | 0.7734 | 0.6880 |
100
+ | 0.2532 | 15.59 | 2900 | 0.4925 | 0.5712 | 0.7273 | 0.7948 | nan | 0.9059 | 0.0955 | 0.8082 | 0.7507 | 0.9018 | 0.8558 | 0.7736 | 0.0 | 0.8223 | 0.0914 | 0.7457 | 0.7283 | 0.7327 | 0.7813 | 0.6678 |
101
+ | 0.2804 | 16.13 | 3000 | 0.4406 | 0.6236 | 0.7967 | 0.8169 | nan | 0.9282 | 0.5433 | 0.8292 | 0.7638 | 0.8782 | 0.8735 | 0.7609 | 0.0 | 0.8340 | 0.4822 | 0.7338 | 0.7438 | 0.7286 | 0.7710 | 0.6958 |
102
+ | 0.2874 | 16.67 | 3100 | 0.4576 | 0.5876 | 0.7474 | 0.8073 | nan | 0.8891 | 0.1620 | 0.8209 | 0.7999 | 0.8955 | 0.8642 | 0.8001 | 0.0 | 0.8199 | 0.1544 | 0.7478 | 0.7678 | 0.7430 | 0.7774 | 0.6904 |
103
+ | 0.2731 | 17.2 | 3200 | 0.4212 | 0.6263 | 0.7914 | 0.8341 | nan | 0.9025 | 0.5544 | 0.7921 | 0.7143 | 0.9193 | 0.8514 | 0.8057 | 0.0 | 0.8120 | 0.4920 | 0.7327 | 0.6956 | 0.7547 | 0.7783 | 0.7447 |
104
+ | 0.1974 | 17.74 | 3300 | 0.4423 | 0.6215 | 0.7846 | 0.8138 | nan | 0.8722 | 0.4908 | 0.8068 | 0.8134 | 0.8748 | 0.8356 | 0.7988 | 0.0 | 0.8076 | 0.4451 | 0.7482 | 0.7737 | 0.7350 | 0.7671 | 0.6953 |
105
+ | 0.1833 | 18.28 | 3400 | 0.4207 | 0.6284 | 0.7944 | 0.8250 | nan | 0.8809 | 0.4986 | 0.8445 | 0.7943 | 0.8829 | 0.8643 | 0.7955 | 0.0 | 0.8206 | 0.4449 | 0.7446 | 0.7776 | 0.7421 | 0.7729 | 0.7244 |
106
+ | 0.1611 | 18.82 | 3500 | 0.4327 | 0.6085 | 0.7733 | 0.8244 | nan | 0.8563 | 0.4147 | 0.7980 | 0.7706 | 0.9201 | 0.8495 | 0.8041 | 0.0 | 0.7886 | 0.3657 | 0.7429 | 0.7326 | 0.7438 | 0.7712 | 0.7230 |
107
+ | 0.1339 | 19.35 | 3600 | 0.4795 | 0.5796 | 0.7369 | 0.7978 | nan | 0.8572 | 0.1929 | 0.7776 | 0.7929 | 0.9016 | 0.8480 | 0.7882 | 0.0 | 0.7934 | 0.1840 | 0.7334 | 0.7498 | 0.7296 | 0.7615 | 0.6848 |
108
+ | 0.1805 | 19.89 | 3700 | 0.4722 | 0.6137 | 0.7739 | 0.8134 | nan | 0.8924 | 0.3871 | 0.8223 | 0.7928 | 0.8949 | 0.8441 | 0.7836 | 0.0 | 0.8249 | 0.3603 | 0.7521 | 0.7557 | 0.7451 | 0.7728 | 0.6990 |
109
+ | 0.173 | 20.43 | 3800 | 0.4495 | 0.6170 | 0.7843 | 0.8230 | nan | 0.9220 | 0.3516 | 0.8189 | 0.8353 | 0.9090 | 0.8546 | 0.7988 | 0.0 | 0.8318 | 0.3314 | 0.7584 | 0.7729 | 0.7453 | 0.7761 | 0.7202 |
110
+ | 0.2476 | 20.97 | 3900 | 0.4426 | 0.6363 | 0.8007 | 0.8331 | nan | 0.9142 | 0.4724 | 0.8094 | 0.8448 | 0.9178 | 0.8330 | 0.8136 | 0.0 | 0.8485 | 0.4259 | 0.7481 | 0.8003 | 0.7454 | 0.7840 | 0.7380 |
111
+ | 0.3163 | 21.51 | 4000 | 0.4550 | 0.6337 | 0.8023 | 0.8273 | nan | 0.8808 | 0.5716 | 0.8209 | 0.8055 | 0.9199 | 0.8518 | 0.7655 | 0.0 | 0.8129 | 0.4923 | 0.7503 | 0.7815 | 0.7402 | 0.7748 | 0.7180 |
112
+ | 0.12 | 22.04 | 4100 | 0.4396 | 0.6142 | 0.7772 | 0.8249 | nan | 0.9254 | 0.2952 | 0.8118 | 0.8303 | 0.9143 | 0.8460 | 0.8174 | 0.0 | 0.8363 | 0.2724 | 0.7579 | 0.7875 | 0.7491 | 0.7787 | 0.7316 |
113
+ | 0.2351 | 22.58 | 4200 | 0.4622 | 0.6020 | 0.7612 | 0.8087 | nan | 0.8992 | 0.2520 | 0.8300 | 0.8150 | 0.8647 | 0.8478 | 0.8200 | 0.0 | 0.8366 | 0.2309 | 0.7655 | 0.7912 | 0.7260 | 0.7714 | 0.6945 |
114
+ | 0.106 | 23.12 | 4300 | 0.4570 | 0.6001 | 0.7624 | 0.8163 | nan | 0.8731 | 0.2566 | 0.7931 | 0.8358 | 0.8961 | 0.8514 | 0.8306 | 0.0 | 0.8110 | 0.2326 | 0.7431 | 0.7773 | 0.7413 | 0.7771 | 0.7182 |
115
+ | 0.2408 | 23.66 | 4400 | 0.4795 | 0.5944 | 0.7501 | 0.8138 | nan | 0.9040 | 0.1740 | 0.8437 | 0.7707 | 0.8980 | 0.8486 | 0.8115 | 0.0 | 0.8454 | 0.1631 | 0.7700 | 0.7579 | 0.7352 | 0.7742 | 0.7091 |
116
+ | 0.1285 | 24.19 | 4500 | 0.4311 | 0.6377 | 0.8055 | 0.8339 | nan | 0.9053 | 0.5473 | 0.8025 | 0.8113 | 0.9093 | 0.8567 | 0.8060 | 0.0 | 0.8279 | 0.4738 | 0.7405 | 0.7775 | 0.7588 | 0.7855 | 0.7380 |
117
+ | 0.2967 | 24.73 | 4600 | 0.4509 | 0.6273 | 0.7913 | 0.8209 | nan | 0.8998 | 0.4427 | 0.8456 | 0.8264 | 0.8666 | 0.8433 | 0.8146 | 0.0 | 0.8437 | 0.4058 | 0.7616 | 0.7873 | 0.7382 | 0.7723 | 0.7098 |
118
+ | 0.2716 | 25.27 | 4700 | 0.4708 | 0.6149 | 0.7815 | 0.8199 | nan | 0.8998 | 0.3343 | 0.8176 | 0.8528 | 0.9088 | 0.8668 | 0.7906 | 0.0 | 0.8345 | 0.3116 | 0.7546 | 0.7633 | 0.7474 | 0.7928 | 0.7152 |
119
+ | 0.2685 | 25.81 | 4800 | 0.4688 | 0.6166 | 0.7778 | 0.8135 | nan | 0.9177 | 0.3472 | 0.8437 | 0.8130 | 0.9059 | 0.8572 | 0.7598 | 0.0 | 0.8452 | 0.3205 | 0.7512 | 0.7864 | 0.7475 | 0.7879 | 0.6937 |
120
+ | 0.1678 | 26.34 | 4900 | 0.4833 | 0.6056 | 0.7701 | 0.8111 | nan | 0.9324 | 0.2775 | 0.8162 | 0.8232 | 0.8679 | 0.8571 | 0.8162 | 0.0 | 0.8521 | 0.2610 | 0.7384 | 0.7874 | 0.7107 | 0.7817 | 0.7138 |
121
+ | 0.1904 | 26.88 | 5000 | 0.4423 | 0.6367 | 0.8050 | 0.8313 | nan | 0.9204 | 0.5078 | 0.8254 | 0.8175 | 0.8890 | 0.8661 | 0.8091 | 0.0 | 0.8452 | 0.4385 | 0.7538 | 0.7841 | 0.7505 | 0.7884 | 0.7333 |
122
+ | 0.202 | 27.42 | 5100 | 0.4582 | 0.6143 | 0.7755 | 0.8206 | nan | 0.8995 | 0.2772 | 0.8510 | 0.8392 | 0.8871 | 0.8583 | 0.8159 | 0.0 | 0.8469 | 0.2557 | 0.7571 | 0.8076 | 0.7329 | 0.7852 | 0.7286 |
123
+ | 0.1242 | 27.96 | 5200 | 0.4692 | 0.6143 | 0.7766 | 0.8192 | nan | 0.8887 | 0.3377 | 0.8350 | 0.8156 | 0.8718 | 0.8649 | 0.8222 | 0.0 | 0.8302 | 0.3055 | 0.7600 | 0.7944 | 0.7255 | 0.7826 | 0.7164 |
124
+ | 0.1594 | 28.49 | 5300 | 0.4588 | 0.6187 | 0.7860 | 0.8190 | nan | 0.9347 | 0.3829 | 0.8410 | 0.8051 | 0.8479 | 0.8606 | 0.8297 | 0.0 | 0.8549 | 0.3390 | 0.7618 | 0.7801 | 0.7272 | 0.7862 | 0.7004 |
125
+ | 0.1414 | 29.03 | 5400 | 0.4591 | 0.6293 | 0.7923 | 0.8344 | nan | 0.9064 | 0.5202 | 0.8183 | 0.7327 | 0.9143 | 0.8469 | 0.8073 | 0.0 | 0.8323 | 0.4441 | 0.7449 | 0.7207 | 0.7524 | 0.7906 | 0.7496 |
126
+ | 0.1208 | 29.57 | 5500 | 0.4611 | 0.6370 | 0.8006 | 0.8280 | nan | 0.9187 | 0.5037 | 0.7983 | 0.8194 | 0.9176 | 0.8641 | 0.7823 | 0.0 | 0.8407 | 0.4436 | 0.7389 | 0.7951 | 0.7609 | 0.7936 | 0.7233 |
127
+ | 0.227 | 30.11 | 5600 | 0.4488 | 0.6304 | 0.7893 | 0.8314 | nan | 0.8984 | 0.4428 | 0.8197 | 0.7898 | 0.9108 | 0.8502 | 0.8135 | 0.0 | 0.8355 | 0.3967 | 0.7621 | 0.7707 | 0.7496 | 0.7875 | 0.7410 |
128
+ | 0.2627 | 30.65 | 5700 | 0.4513 | 0.6384 | 0.8022 | 0.8295 | nan | 0.8818 | 0.5132 | 0.8376 | 0.8269 | 0.9009 | 0.8641 | 0.7907 | 0.0 | 0.8304 | 0.4484 | 0.7652 | 0.8068 | 0.7548 | 0.7772 | 0.7245 |
129
+ | 0.1512 | 31.18 | 5800 | 0.4655 | 0.6151 | 0.7736 | 0.8222 | nan | 0.9195 | 0.3265 | 0.7978 | 0.8000 | 0.9074 | 0.8461 | 0.8181 | 0.0 | 0.8427 | 0.2993 | 0.7452 | 0.7803 | 0.7358 | 0.7824 | 0.7347 |
130
+ | 0.0861 | 31.72 | 5900 | 0.4754 | 0.6258 | 0.7850 | 0.8212 | nan | 0.9167 | 0.3799 | 0.8171 | 0.8288 | 0.9081 | 0.8549 | 0.7897 | 0.0 | 0.8467 | 0.3493 | 0.7586 | 0.8013 | 0.7459 | 0.7807 | 0.7235 |
131
+ | 0.1755 | 32.26 | 6000 | 0.5160 | 0.5938 | 0.7507 | 0.8166 | nan | 0.9229 | 0.1614 | 0.8288 | 0.7708 | 0.9159 | 0.8458 | 0.8096 | 0.0 | 0.8410 | 0.1522 | 0.7619 | 0.7540 | 0.7284 | 0.7849 | 0.7280 |
132
+ | 0.1556 | 32.8 | 6100 | 0.4894 | 0.6129 | 0.7734 | 0.8141 | nan | 0.8895 | 0.3034 | 0.8242 | 0.8482 | 0.8989 | 0.8603 | 0.7893 | 0.0 | 0.8294 | 0.2825 | 0.7532 | 0.8083 | 0.7317 | 0.7820 | 0.7158 |
133
+ | 0.0927 | 33.33 | 6200 | 0.4981 | 0.6298 | 0.7908 | 0.8216 | nan | 0.9017 | 0.4726 | 0.8020 | 0.8062 | 0.9026 | 0.8678 | 0.7829 | 0.0 | 0.8340 | 0.4196 | 0.7415 | 0.7908 | 0.7476 | 0.7910 | 0.7138 |
134
+ | 0.1746 | 33.87 | 6300 | 0.4925 | 0.6243 | 0.7844 | 0.8169 | nan | 0.9056 | 0.4036 | 0.8058 | 0.8331 | 0.8925 | 0.8574 | 0.7928 | 0.0 | 0.8421 | 0.3654 | 0.7480 | 0.8046 | 0.7419 | 0.7887 | 0.7037 |
135
+ | 0.2803 | 34.41 | 6400 | 0.5036 | 0.6192 | 0.7785 | 0.8179 | nan | 0.9037 | 0.4145 | 0.8096 | 0.7774 | 0.9187 | 0.8604 | 0.7655 | 0.0 | 0.8204 | 0.3805 | 0.7519 | 0.7603 | 0.7509 | 0.7855 | 0.7038 |
136
+ | 0.1777 | 34.95 | 6500 | 0.4886 | 0.6197 | 0.7787 | 0.8209 | nan | 0.9173 | 0.3337 | 0.8366 | 0.8071 | 0.9066 | 0.8615 | 0.7883 | 0.0 | 0.8405 | 0.3112 | 0.7654 | 0.7895 | 0.7486 | 0.7875 | 0.7149 |
137
+ | 0.1073 | 35.48 | 6600 | 0.4839 | 0.6271 | 0.7868 | 0.8245 | nan | 0.9297 | 0.3696 | 0.8335 | 0.8214 | 0.9102 | 0.8503 | 0.7930 | 0.0 | 0.8519 | 0.3424 | 0.7647 | 0.7957 | 0.7553 | 0.7908 | 0.7156 |
138
+ | 0.0958 | 36.02 | 6700 | 0.5011 | 0.6186 | 0.7744 | 0.8177 | nan | 0.9017 | 0.3398 | 0.8211 | 0.8092 | 0.8996 | 0.8509 | 0.7981 | 0.0 | 0.8383 | 0.3189 | 0.7575 | 0.7915 | 0.7508 | 0.7882 | 0.7037 |
139
+ | 0.1107 | 36.56 | 6800 | 0.4854 | 0.6291 | 0.7880 | 0.8229 | nan | 0.9108 | 0.3956 | 0.8209 | 0.8310 | 0.9093 | 0.8608 | 0.7876 | 0.0 | 0.8423 | 0.3649 | 0.7552 | 0.8064 | 0.7546 | 0.7949 | 0.7143 |
140
+ | 0.1541 | 37.1 | 6900 | 0.4965 | 0.6126 | 0.7684 | 0.8160 | nan | 0.9126 | 0.2968 | 0.7884 | 0.8303 | 0.9209 | 0.8291 | 0.8006 | 0.0 | 0.8398 | 0.2769 | 0.7381 | 0.8048 | 0.7358 | 0.7817 | 0.7240 |
141
+ | 0.445 | 37.63 | 7000 | 0.5060 | 0.6189 | 0.7769 | 0.8191 | nan | 0.9039 | 0.3292 | 0.8121 | 0.8313 | 0.9127 | 0.8582 | 0.7910 | 0.0 | 0.8355 | 0.3081 | 0.7533 | 0.8065 | 0.7455 | 0.7894 | 0.7132 |
142
+ | 0.2018 | 38.17 | 7100 | 0.5000 | 0.6144 | 0.7723 | 0.8270 | nan | 0.9077 | 0.2693 | 0.8312 | 0.8125 | 0.9103 | 0.8466 | 0.8282 | 0.0 | 0.8408 | 0.2498 | 0.7595 | 0.7957 | 0.7438 | 0.7901 | 0.7351 |
143
+ | 0.3123 | 38.71 | 7200 | 0.5074 | 0.6111 | 0.7691 | 0.8184 | nan | 0.9025 | 0.2503 | 0.8243 | 0.8337 | 0.8975 | 0.8623 | 0.8133 | 0.0 | 0.8378 | 0.2337 | 0.7602 | 0.8090 | 0.7412 | 0.7912 | 0.7153 |
144
+ | 0.1877 | 39.25 | 7300 | 0.5227 | 0.6147 | 0.7722 | 0.8159 | nan | 0.9113 | 0.3006 | 0.8216 | 0.8232 | 0.9182 | 0.8541 | 0.7766 | 0.0 | 0.8393 | 0.2827 | 0.7559 | 0.7979 | 0.7479 | 0.7922 | 0.7018 |
145
+ | 0.1139 | 39.78 | 7400 | 0.5134 | 0.6146 | 0.7727 | 0.8208 | nan | 0.9234 | 0.2839 | 0.8203 | 0.8118 | 0.9109 | 0.8565 | 0.8021 | 0.0 | 0.8449 | 0.2659 | 0.7558 | 0.7906 | 0.7470 | 0.7941 | 0.7181 |
146
+ | 0.2875 | 40.32 | 7500 | 0.4953 | 0.6309 | 0.7919 | 0.8246 | nan | 0.9165 | 0.4221 | 0.8274 | 0.8228 | 0.9026 | 0.8602 | 0.7917 | 0.0 | 0.8437 | 0.3822 | 0.7642 | 0.7975 | 0.7566 | 0.7930 | 0.7096 |
147
+ | 0.1543 | 40.86 | 7600 | 0.5131 | 0.6227 | 0.7814 | 0.8225 | nan | 0.9159 | 0.3537 | 0.8144 | 0.8157 | 0.9106 | 0.8651 | 0.7943 | 0.0 | 0.8452 | 0.3252 | 0.7587 | 0.7914 | 0.7537 | 0.7956 | 0.7120 |
148
+ | 0.0935 | 41.4 | 7700 | 0.4870 | 0.6333 | 0.7958 | 0.8326 | nan | 0.9115 | 0.4304 | 0.8214 | 0.8199 | 0.9069 | 0.8677 | 0.8127 | 0.0 | 0.8401 | 0.3865 | 0.7595 | 0.7926 | 0.7518 | 0.7931 | 0.7424 |
149
+ | 0.1449 | 41.94 | 7800 | 0.5064 | 0.6131 | 0.7718 | 0.8177 | nan | 0.9140 | 0.2785 | 0.8342 | 0.8206 | 0.9070 | 0.8563 | 0.7919 | 0.0 | 0.8467 | 0.2571 | 0.7621 | 0.7934 | 0.7425 | 0.7947 | 0.7085 |
150
+ | 0.196 | 42.47 | 7900 | 0.4914 | 0.6158 | 0.7747 | 0.8250 | nan | 0.9065 | 0.3182 | 0.8195 | 0.7971 | 0.9159 | 0.8592 | 0.8066 | 0.0 | 0.8378 | 0.2900 | 0.7554 | 0.7732 | 0.7429 | 0.7941 | 0.7334 |
151
+ | 0.0902 | 43.01 | 8000 | 0.5049 | 0.6092 | 0.7678 | 0.8156 | nan | 0.9207 | 0.2618 | 0.8027 | 0.8255 | 0.9075 | 0.8543 | 0.8020 | 0.0 | 0.8510 | 0.2407 | 0.7415 | 0.7976 | 0.7388 | 0.7934 | 0.7108 |
152
+ | 0.0981 | 43.55 | 8100 | 0.5101 | 0.6028 | 0.7602 | 0.8140 | nan | 0.9289 | 0.2040 | 0.8124 | 0.8145 | 0.9034 | 0.8485 | 0.8094 | 0.0 | 0.8519 | 0.1903 | 0.7504 | 0.7913 | 0.7421 | 0.7891 | 0.7070 |
153
+ | 0.0804 | 44.09 | 8200 | 0.5136 | 0.6187 | 0.7777 | 0.8179 | nan | 0.9224 | 0.3260 | 0.8225 | 0.8244 | 0.9141 | 0.8543 | 0.7800 | 0.0 | 0.8514 | 0.3008 | 0.7543 | 0.7957 | 0.7521 | 0.7924 | 0.7033 |
154
+ | 0.125 | 44.62 | 8300 | 0.5089 | 0.6182 | 0.7770 | 0.8184 | nan | 0.9165 | 0.3390 | 0.8121 | 0.8162 | 0.9054 | 0.8558 | 0.7939 | 0.0 | 0.8459 | 0.3079 | 0.7522 | 0.7915 | 0.7521 | 0.7904 | 0.7052 |
155
+ | 0.1567 | 45.16 | 8400 | 0.5128 | 0.6093 | 0.7676 | 0.8178 | nan | 0.9259 | 0.2550 | 0.8216 | 0.8056 | 0.9131 | 0.8573 | 0.7945 | 0.0 | 0.8460 | 0.2413 | 0.7548 | 0.7802 | 0.7473 | 0.7921 | 0.7124 |
156
+ | 0.1533 | 45.7 | 8500 | 0.5073 | 0.6144 | 0.7719 | 0.8183 | nan | 0.9250 | 0.3017 | 0.8238 | 0.7985 | 0.9106 | 0.8532 | 0.7905 | 0.0 | 0.8477 | 0.2807 | 0.7582 | 0.7781 | 0.7505 | 0.7924 | 0.7075 |
157
+ | 0.1264 | 46.24 | 8600 | 0.5117 | 0.6175 | 0.7771 | 0.8281 | nan | 0.9186 | 0.3045 | 0.8219 | 0.8005 | 0.9130 | 0.8633 | 0.8178 | 0.0 | 0.8426 | 0.2823 | 0.7565 | 0.7777 | 0.7496 | 0.7957 | 0.7358 |
158
+ | 0.196 | 46.77 | 8700 | 0.5079 | 0.6199 | 0.7785 | 0.8230 | nan | 0.9060 | 0.3527 | 0.8219 | 0.8023 | 0.9128 | 0.8590 | 0.7949 | 0.0 | 0.8362 | 0.3234 | 0.7564 | 0.7803 | 0.7515 | 0.7915 | 0.7200 |
159
+ | 0.1576 | 47.31 | 8800 | 0.5046 | 0.6192 | 0.7778 | 0.8236 | nan | 0.9132 | 0.3347 | 0.8159 | 0.8091 | 0.9072 | 0.8549 | 0.8096 | 0.0 | 0.8410 | 0.3079 | 0.7548 | 0.7871 | 0.7504 | 0.7915 | 0.7211 |
160
+ | 0.1362 | 47.85 | 8900 | 0.5086 | 0.6226 | 0.7815 | 0.8228 | nan | 0.9132 | 0.3548 | 0.8207 | 0.8199 | 0.9185 | 0.8555 | 0.7881 | 0.0 | 0.8427 | 0.3250 | 0.7574 | 0.7954 | 0.7543 | 0.7932 | 0.7124 |
161
+ | 0.1221 | 48.39 | 9000 | 0.5128 | 0.6172 | 0.7760 | 0.8201 | nan | 0.9197 | 0.3168 | 0.8285 | 0.8051 | 0.9096 | 0.8635 | 0.7887 | 0.0 | 0.8438 | 0.2946 | 0.7603 | 0.7838 | 0.7511 | 0.7937 | 0.7101 |
162
+ | 0.1495 | 48.92 | 9100 | 0.5083 | 0.6219 | 0.7808 | 0.8212 | nan | 0.9145 | 0.3516 | 0.8284 | 0.8180 | 0.9109 | 0.8546 | 0.7879 | 0.0 | 0.8451 | 0.3218 | 0.7593 | 0.7943 | 0.7525 | 0.7924 | 0.7096 |
163
+ | 0.2303 | 49.46 | 9200 | 0.5041 | 0.6165 | 0.7747 | 0.8238 | nan | 0.9155 | 0.3038 | 0.8206 | 0.8039 | 0.9129 | 0.8603 | 0.8058 | 0.0 | 0.8434 | 0.2804 | 0.7582 | 0.7845 | 0.7494 | 0.7953 | 0.7211 |
164
+ | 0.1028 | 50.0 | 9300 | 0.5044 | 0.6138 | 0.7713 | 0.8213 | nan | 0.9139 | 0.2842 | 0.8156 | 0.8117 | 0.9098 | 0.8540 | 0.8096 | 0.0 | 0.8428 | 0.2638 | 0.7560 | 0.7896 | 0.7482 | 0.7927 | 0.7176 |
+
+
+ ### Framework versions
+
+ - Transformers 4.33.0
+ - Pytorch 2.0.0
+ - Datasets 2.1.0
+ - Tokenizers 0.13.3
config.json ADDED
@@ -0,0 +1,90 @@
+ {
+ "_name_or_path": "nvidia/mit-b0",
+ "architectures": [
+ "SegformerForSemanticSegmentation"
+ ],
+ "attention_probs_dropout_prob": 0.0,
+ "classifier_dropout_prob": 0.1,
+ "decoder_hidden_size": 256,
+ "depths": [
+ 2,
+ 2,
+ 2,
+ 2
+ ],
+ "downsampling_rates": [
+ 1,
+ 4,
+ 8,
+ 16
+ ],
+ "drop_path_rate": 0.1,
+ "hidden_act": "gelu",
+ "hidden_dropout_prob": 0.0,
+ "hidden_sizes": [
+ 32,
+ 64,
+ 160,
+ 256
+ ],
+ "id2label": {
+ "0": "Background (waterbody)",
+ "1": "Human divers",
+ "2": "Aquatic plants and sea-grass",
+ "3": "Wrecks and ruins",
+ "4": "Robots (AUVs/ROVs/instruments)",
+ "5": "Reefs and invertebrates",
+ "6": "Fish and vertebrates",
+ "7": "Sea-floor and rocks"
+ },
+ "image_size": 224,
+ "initializer_range": 0.02,
+ "label2id": {
+ "Aquatic plants and sea-grass": 2,
+ "Background (waterbody)": 0,
+ "Fish and vertebrates": 6,
+ "Human divers": 1,
+ "Reefs and invertebrates": 5,
+ "Robots (AUVs/ROVs/instruments)": 4,
+ "Sea-floor and rocks": 7,
+ "Wrecks and ruins": 3
+ },
+ "layer_norm_eps": 1e-06,
+ "mlp_ratios": [
+ 4,
+ 4,
+ 4,
+ 4
+ ],
+ "model_type": "segformer",
+ "num_attention_heads": [
+ 1,
+ 2,
+ 5,
+ 8
+ ],
+ "num_channels": 3,
+ "num_encoder_blocks": 4,
+ "patch_sizes": [
+ 7,
+ 3,
+ 3,
+ 3
+ ],
+ "reshape_last_stage": true,
+ "semantic_loss_ignore_index": 255,
+ "sr_ratios": [
+ 8,
+ 4,
+ 2,
+ 1
+ ],
+ "strides": [
+ 4,
+ 2,
+ 2,
+ 2
+ ],
+ "torch_dtype": "float32",
+ "transformers_version": "4.33.0"
+ }
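A model with this label set can be initialized from the base checkpoint roughly as follows (a sketch reconstructed from the `id2label`/`label2id` entries above, not the exact training script):

```python
from transformers import SegformerForSemanticSegmentation

id2label = {
    0: "Background (waterbody)",
    1: "Human divers",
    2: "Aquatic plants and sea-grass",
    3: "Wrecks and ruins",
    4: "Robots (AUVs/ROVs/instruments)",
    5: "Reefs and invertebrates",
    6: "Fish and vertebrates",
    7: "Sea-floor and rocks",
}
label2id = {name: idx for idx, name in id2label.items()}

# from_pretrained forwards id2label/label2id to the SegformerConfig, attaching
# a randomly initialized 8-class decode head to the mit-b0 encoder.
model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",
    id2label=id2label,
    label2id=label2id,
)
```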
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:53fddfaf5fb45d7add5510fb984442356da4a57c7691f4c8a6b312a9fb70b75c
+ size 14937933
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4fac5b7d2ba73522db2f0dd9aba63b34e4927f77e439b095331a79ef03b83eef
+ size 4091