---
license: other
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-busigt2
  results: []
---
# segformer-b0-finetuned-busigt2

This model is a fine-tuned version of [nvidia/mit-b1](https://huggingface.co/nvidia/mit-b1) on the kasumi222/busigt5 dataset. It achieves the following results on the evaluation set:
- Loss: 0.1923
- Mean Iou: 0.4456
- Mean Accuracy: 0.6990
- Overall Accuracy: 0.6980
- Per Category Iou: [0.0, 0.6613256012770924, 0.6755795107848668]
- Per Category Accuracy: [nan, 0.6930571879874813, 0.7049128240257888]
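The first `Per Category Accuracy` entry is `nan` while the matching IoU is `0.0`: this pattern arises when a class never occurs in the ground-truth masks but is still occasionally predicted, so accuracy's denominator (TP + FN) is zero while IoU's (TP + FP + FN) is not. A minimal numpy sketch of that per-category computation — not the evaluation code used for this card, and with made-up toy labels:

```python
import numpy as np

# Toy flattened segmentation maps: class 0 never appears in the labels,
# but is predicted once, so its IoU is 0.0 and its accuracy is nan.
labels = np.array([1, 1, 2, 2, 2, 1])
preds  = np.array([1, 0, 2, 2, 1, 1])

num_classes = 3
iou, acc = [], []
for c in range(num_classes):
    tp = np.sum((preds == c) & (labels == c))
    fp = np.sum((preds == c) & (labels != c))
    fn = np.sum((preds != c) & (labels == c))
    # IoU = TP / (TP + FP + FN); accuracy = TP / (TP + FN)
    iou.append(tp / (tp + fp + fn) if (tp + fp + fn) else float("nan"))
    acc.append(tp / (tp + fn) if (tp + fn) else float("nan"))
```

Here `iou[0]` comes out as `0.0` and `acc[0]` as `nan`, mirroring the shape of the reported per-category lists.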
## Model description

More information needed
## Intended uses & limitations

More information needed
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.00013
- train_batch_size: 20
- eval_batch_size: 20
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
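With `lr_scheduler_type: linear`, the learning rate decays linearly from `learning_rate` toward 0 over the course of training (assuming no warmup, which the card does not mention). A minimal sketch; the total step count is hypothetical — 380 below is simply the last step logged in the results table, not a stated training length:

```python
# Sketch of a linear (no-warmup) learning-rate schedule, as implied by
# lr_scheduler_type: linear. Values beyond learning_rate are assumptions.
LEARNING_RATE = 0.00013

def linear_lr(step: int, total_steps: int) -> float:
    """Decay linearly from LEARNING_RATE at step 0 to 0 at total_steps."""
    return LEARNING_RATE * max(0.0, 1.0 - step / total_steps)
```

For example, `linear_lr(0, 380)` returns the base rate and `linear_lr(380, 380)` returns 0.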
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:---------------------:|
0.44 | 0.77 | 20 | 0.4848 | 0.2588 | 0.4242 | 0.4165 | [0.0, 0.3504134718879466, 0.42602074549863] | [nan, 0.3783013333071708, 0.47012910920923223] |
0.3746 | 1.54 | 40 | 0.3811 | 0.3111 | 0.5104 | 0.5161 | [0.0, 0.4842565751294053, 0.4491183491823408] | [nan, 0.5445819066586871, 0.4761423846712137] |
0.3319 | 2.31 | 60 | 0.3142 | 0.2399 | 0.3819 | 0.3648 | [0.0, 0.2677680368240358, 0.45190264244757206] | [nan, 0.279724508717023, 0.4840686746895068] |
0.2211 | 3.08 | 80 | 0.2530 | 0.2975 | 0.4750 | 0.4814 | [0.0, 0.4706420141972068, 0.4218805055880307] | [nan, 0.5129361209493069, 0.4371271301147258] |
0.2097 | 3.85 | 100 | 0.2433 | 0.2900 | 0.4480 | 0.4460 | [0.0, 0.43160250466254296, 0.43845878928113535] | [nan, 0.43601714952858417, 0.4600634808054265] |
0.1869 | 4.62 | 120 | 0.2363 | 0.3632 | 0.5993 | 0.6166 | [0.0, 0.6101054609134765, 0.4795020991617725] | [nan, 0.7024776555770306, 0.4960379563233556] |
0.1632 | 5.38 | 140 | 0.2386 | 0.4353 | 0.6988 | 0.7093 | [0.0, 0.6991547342129519, 0.6066344865447587] | [nan, 0.7611584761643136, 0.6364771236719307] |
0.1939 | 6.15 | 160 | 0.2166 | 0.3374 | 0.5337 | 0.5229 | [0.0, 0.46597040423540376, 0.5460862797699997] | [nan, 0.4692625114052214, 0.5981270739467682] |
0.2074 | 6.92 | 180 | 0.2209 | 0.3219 | 0.5524 | 0.5826 | [0.0, 0.5973031615126874, 0.3684272471225944] | [nan, 0.7324595053322476, 0.37233573901062894] |
0.1243 | 7.69 | 200 | 0.2214 | 0.3890 | 0.6490 | 0.6624 | [0.0, 0.6202500066793128, 0.5468835150043967] | [nan, 0.728845938760093, 0.5691879104528569] |
0.1079 | 8.46 | 220 | 0.2349 | 0.3889 | 0.6489 | 0.6659 | [0.0, 0.6357100818960788, 0.5310789782806924] | [nan, 0.7502934453088975, 0.5474674308445788] |
0.1355 | 9.23 | 240 | 0.1988 | 0.4417 | 0.6835 | 0.6772 | [0.0, 0.6347203785731718, 0.6904229863937434] | [nan, 0.6460698342931706, 0.7208882026363698] |
0.1258 | 10.0 | 260 | 0.1985 | 0.4181 | 0.6597 | 0.6552 | [0.0, 0.5971057628808589, 0.6571229098113397] | [nan, 0.6326629842926801, 0.6867490977333476] |
0.1098 | 10.77 | 280 | 0.1959 | 0.4578 | 0.7091 | 0.7047 | [0.0, 0.6732406203240918, 0.7002946319094014] | [nan, 0.6828090692358256, 0.7353112530851077] |
0.097 | 11.54 | 300 | 0.1968 | 0.4401 | 0.6784 | 0.6719 | [0.0, 0.6352861327900514, 0.6850189737146176] | [nan, 0.6398854081842887, 0.7169096389721925] |
0.0844 | 12.31 | 320 | 0.1959 | 0.4164 | 0.6610 | 0.6637 | [0.0, 0.6274020120203341, 0.6218360931423603] | [nan, 0.677014333787907, 0.645067517189047] |
0.1543 | 13.08 | 340 | 0.2004 | 0.4261 | 0.6663 | 0.6626 | [0.0, 0.618565945734453, 0.6597620117886153] | [nan, 0.6440009026067676, 0.6886109003283072] |
0.0871 | 13.85 | 360 | 0.1967 | 0.4270 | 0.6699 | 0.6664 | [0.0, 0.619536460193106, 0.6615374800234315] | [nan, 0.6488875371589471, 0.6909044252641271] |
0.1012 | 14.62 | 380 | 0.1923 | 0.4456 | 0.6990 | 0.6980 | [0.0, 0.6613256012770924, 0.6755795107848668] | [nan, 0.6930571879874813, 0.7049128240257888] |
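As a sanity check, the summary metrics in the final row are consistent with the per-category values, assuming IoU is averaged over all classes (including the 0.0 class) and accuracy with a nan-aware mean — the aggregation used by the `evaluate` library's `mean_iou` metric:

```python
import numpy as np

# Per-category values from the final evaluation row of this card.
per_category_iou = np.array([0.0, 0.6613256012770924, 0.6755795107848668])
per_category_acc = np.array([np.nan, 0.6930571879874813, 0.7049128240257888])

mean_iou = per_category_iou.mean()       # plain mean: the 0.0 class counts
mean_acc = np.nanmean(per_category_acc)  # nan entries are excluded
```

These reproduce the reported Mean Iou of 0.4456 and Mean Accuracy of 0.6990 to four decimal places.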
### Framework versions
- Transformers 4.21.3
- Pytorch 1.12.1+cu113
- Datasets 2.4.0
- Tokenizers 0.12.1