segformer-b0-scene-parse-150-lr-5-e-30

This model is a fine-tuned version of DiTo97/binarization-segformer-b3 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1499
  • Mean Iou: 0.4846
  • Mean Accuracy: 0.5002
  • Overall Accuracy: 0.9687
  • Per Category Iou: [0.0004776099180272339, 0.9687318368688158]
  • Per Category Accuracy: [0.0004786768150677956, 0.9999280293990062]
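For reference, the metrics above follow the usual semantic-segmentation definitions: per-category IoU is intersection over union for each class, per-category accuracy is per-class recall, and the means average over classes. A minimal NumPy sketch of how such numbers can be recomputed from predicted and ground-truth label maps (the function name and two-class default are illustrative assumptions, not this card's evaluation code):

```python
import numpy as np

def segmentation_metrics(pred, target, num_classes=2):
    """Per-category IoU/accuracy and their means from two integer label maps."""
    ious, accs = [], []
    for c in range(num_classes):
        pred_c = pred == c
        tgt_c = target == c
        inter = np.logical_and(pred_c, tgt_c).sum()
        union = np.logical_or(pred_c, tgt_c).sum()
        # IoU: intersection over union; accuracy: per-class recall.
        ious.append(inter / union if union else float("nan"))
        accs.append(inter / tgt_c.sum() if tgt_c.sum() else float("nan"))
    return {
        "per_category_iou": ious,
        "per_category_accuracy": accs,
        "mean_iou": float(np.nanmean(ious)),
        "mean_accuracy": float(np.nanmean(accs)),
        "overall_accuracy": float((pred == target).mean()),
    }
```

Note that the reported Mean Iou (0.4846) and Mean Accuracy (0.5002) are simply the averages of the two per-category values above.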

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 30
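Under a linear scheduler with no warmup, the learning rate decays from its initial value to zero over the whole run. A minimal sketch, assuming the 3360 total optimizer steps shown in the training log (112 steps per epoch × 30 epochs):

```python
def linear_lr(step, base_lr=1e-5, total_steps=3360):
    # Linear decay from base_lr at step 0 to 0 at total_steps,
    # assuming no warmup phase.
    return base_lr * max(0.0, 1.0 - step / total_steps)
```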

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 112 | 2.2080 | 0.0208 | 0.4867 | 0.0409 | [0.030359692962016637, 0.011234027937699289] | [0.9620657111236344, 0.011247758028276789] |
| No log | 2.0 | 224 | 1.4759 | 0.0198 | 0.4977 | 0.0391 | [0.03105738195300228, 0.008579759290056214] | [0.9867430286323423, 0.008583423879932913] |
| No log | 3.0 | 336 | 0.9545 | 0.2044 | 0.5106 | 0.3895 | [0.03167545114091724, 0.3770365875250204] | [0.63983643511383, 0.38141170005716085] |
| No log | 4.0 | 448 | 0.9261 | 0.1630 | 0.5110 | 0.3102 | [0.03177509935992916, 0.2941818944688777] | [0.7253108004426911, 0.29678542982527334] |
| 1.4226 | 5.0 | 560 | 0.5223 | 0.4799 | 0.5020 | 0.9428 | [0.017040433028331244, 0.9427249005417263] | [0.031779387701061236, 0.9721328848551324] |
| 1.4226 | 6.0 | 672 | 0.2859 | 0.4844 | 0.5000 | 0.9661 | [0.0026066960283743228, 0.9661016634825252] | [0.0028381121801182773, 0.997139715499558] |
| 1.4226 | 7.0 | 784 | 0.3327 | 0.4845 | 0.5001 | 0.9666 | [0.0023741570079603156, 0.9666249471662205] | [0.0025444558361227857, 0.9976889561878419] |
| 1.4226 | 8.0 | 896 | 0.3413 | 0.4861 | 0.5020 | 0.9655 | [0.006810437487136094, 0.965477402345672] | [0.0075824444429356125, 0.996347820132999] |
| 0.4243 | 9.0 | 1008 | 0.2237 | 0.4845 | 0.5001 | 0.9670 | [0.002003100646207948, 0.9669488069696599] | [0.002125189264059858, 0.998036285433216] |
| 0.4243 | 10.0 | 1120 | 0.2312 | 0.4845 | 0.5001 | 0.9673 | [0.0017493474271832114, 0.9672550380074268] | [0.001838322662122066, 0.9983613016047257] |
| 0.4243 | 11.0 | 1232 | 0.2255 | 0.4846 | 0.5001 | 0.9683 | [0.0008165323590605036, 0.9682956004963261] | [0.0008300459665537307, 0.9994667831613294] |
| 0.4243 | 12.0 | 1344 | 0.2229 | 0.4845 | 0.5001 | 0.9681 | [0.0009280118466771697, 0.9681143324063521] | [0.0009488664525634671, 0.9992759735664452] |
| 0.4243 | 13.0 | 1456 | 0.1762 | 0.4844 | 0.5000 | 0.9688 | [0.0, 0.9687853389316134] | [0.0, 0.9999981952660845] |
| 0.2658 | 14.0 | 1568 | 0.1799 | 0.4844 | 0.5000 | 0.9688 | [0.0, 0.9687686496310763] | [0.0, 0.9999809682605274] |
| 0.2658 | 15.0 | 1680 | 0.1895 | 0.4844 | 0.5000 | 0.9688 | [0.0, 0.9687842263115777] | [0.0, 0.9999970467990473] |
| 0.2658 | 16.0 | 1792 | 0.1751 | 0.4846 | 0.5002 | 0.9687 | [0.0005055791248239351, 0.9686817929933391] | [0.0005075332188130173, 0.9998754733598305] |
| 0.2658 | 17.0 | 1904 | 0.1570 | 0.4844 | 0.5000 | 0.9688 | [0.0, 0.9687863985697428] | [0.0, 0.9999992890442151] |
| 0.2281 | 18.0 | 2016 | 0.2391 | 0.4892 | 0.5049 | 0.9676 | [0.010804445997179198, 0.9675856416536037] | [0.01133886923635771, 0.9984063652637066] |
| 0.2281 | 19.0 | 2128 | 0.1763 | 0.4846 | 0.5002 | 0.9687 | [0.00047390236595756206, 0.9687105941224816] | [0.00047528194403894595, 0.9999062085253005] |
| 0.2281 | 20.0 | 2240 | 0.1614 | 0.4844 | 0.5000 | 0.9688 | [8.31166363884634e-05, 0.96876793276621] | [8.317434020681555e-05, 0.9999776322372291] |
| 0.2281 | 21.0 | 2352 | 0.1576 | 0.4844 | 0.5000 | 0.9688 | [0.0, 0.9687801996866862] | [0.0, 0.999992890442151] |
| 0.2281 | 22.0 | 2464 | 0.1509 | 0.4848 | 0.5004 | 0.9687 | [0.0008725737882892481, 0.9686954205214706] | [0.0008758767254432005, 0.9998780437384374] |
| 0.2197 | 23.0 | 2576 | 0.1575 | 0.4844 | 0.5000 | 0.9688 | [0.00013731233980227023, 0.9687503427281783] | [0.00013749227666840936, 0.9999577801641586] |
| 0.2197 | 24.0 | 2688 | 0.1522 | 0.4847 | 0.5002 | 0.9687 | [0.0006022476015827608, 0.96869965961662] | [0.0006042870431352313, 0.999890895631472] |
| 0.2197 | 25.0 | 2800 | 0.1532 | 0.4844 | 0.5000 | 0.9688 | [0.0, 0.9687816831800673] | [0.0, 0.9999944217315339] |
| 0.2197 | 26.0 | 2912 | 0.1589 | 0.4844 | 0.5000 | 0.9688 | [7.462281558667101e-05, 0.9687624309087838] | [7.46871626346915e-05, 0.9999722180354826] |
| 0.2129 | 27.0 | 3024 | 0.1612 | 0.4845 | 0.5001 | 0.9687 | [0.0003791835028091742, 0.9687128062602468] | [0.00038022555523115677, 0.9999114586603274] |
| 0.2129 | 28.0 | 3136 | 0.1526 | 0.4844 | 0.5000 | 0.9688 | [4.750029687685548e-05, 0.9687701927590581] | [4.7528194403894597e-05, 0.9999810776383405] |
| 0.2129 | 29.0 | 3248 | 0.1501 | 0.4844 | 0.5000 | 0.9688 | [2.375603233535373e-05, 0.9687772095405226] | [2.3764097201947298e-05, 0.999989062218694] |
| 0.2129 | 30.0 | 3360 | 0.1499 | 0.4846 | 0.5002 | 0.9687 | [0.0004776099180272339, 0.9687318368688158] | [0.0004786768150677956, 0.9999280293990062] |
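SegFormer-style decoders typically emit logits at a quarter of the input resolution, so a segmentation mask is recovered by upsampling the logits and taking a per-pixel argmax over the two classes. A minimal NumPy sketch using random stand-in logits (the shapes and nearest-neighbour upsampling are illustrative assumptions, not this model's exact pipeline):

```python
import numpy as np

# Stand-in logits for a 2-class (binarization) head at 1/4 resolution:
# (batch, classes, H/4, W/4).
rng = np.random.default_rng(0)
logits = rng.normal(size=(1, 2, 8, 8))

# Nearest-neighbour upsample back to the input size, then take the
# per-pixel argmax to obtain a binary mask.
upsampled = logits.repeat(4, axis=2).repeat(4, axis=3)  # (1, 2, 32, 32)
mask = upsampled.argmax(axis=1)[0]                      # (32, 32), values in {0, 1}
```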

Framework versions

  • Transformers 4.37.1
  • Pytorch 2.1.0+cu121
  • Datasets 2.16.1
  • Tokenizers 0.15.1