segformer-b0-finetuned-100by100PNG-50epochs-attempt2-100epochsNoReduce

This model is a fine-tuned version of nvidia/mit-b0 on the JCAI2000/100By100BranchPNG dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the metrics):

  • Loss: 0.2286
  • Mean Iou: 0.8224
  • Mean Accuracy: 1.0
  • Overall Accuracy: 1.0
  • Accuracy Branch: 1.0
  • Iou Branch: 0.8224
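
The checkpoint can be loaded for semantic segmentation with the transformers library. The sketch below is a minimal example, not a confirmed recipe: the hub repository id and the input file name branch.png are assumptions, so adjust both to match where the checkpoint and your images actually live.

```python
# Minimal inference sketch. The repository id and input file name are
# assumptions, not confirmed by this model card.
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

checkpoint = "JCAI2000/segformer-b0-finetuned-100by100PNG-50epochs-attempt2-100epochsNoReduce"

processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("branch.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 of the input resolution; upsample the logits
# back to the input size before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # (H, W) integer label map
```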

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a corresponding TrainingArguments sketch follows the list):

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
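
These values map onto a transformers TrainingArguments configuration roughly as below. This is a sketch only: the output_dir is hypothetical, the listed Adam betas/epsilon and linear schedule are already the library defaults, and the steps-based evaluation cadence is inferred from the 20-step intervals in the results table that follows.

```python
# Sketch of the listed hyperparameters as transformers TrainingArguments.
# output_dir is hypothetical; Adam betas=(0.9, 0.999) and epsilon=1e-08
# are the defaults, so they need no explicit arguments here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-100by100PNG",  # hypothetical
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    evaluation_strategy="steps",  # inferred: the table evaluates every 20 steps
    eval_steps=20,
)
```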

Training results

| Training Loss | Epoch  | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Branch | Iou Branch |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:---------------:|:----------:|
| 0.5864 | 1.05 | 20 | 0.6005 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.4461 | 2.11 | 40 | 0.4242 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.2556 | 3.16 | 60 | 0.3245 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.234 | 4.21 | 80 | 0.3176 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.1917 | 5.26 | 100 | 0.2751 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.2608 | 6.32 | 120 | 0.2997 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.2789 | 7.37 | 140 | 0.2508 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.2173 | 8.42 | 160 | 0.2684 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.1552 | 9.47 | 180 | 0.2374 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.1809 | 10.53 | 200 | 0.2596 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.1342 | 11.58 | 220 | 0.2375 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.1946 | 12.63 | 240 | 0.2211 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.1215 | 13.68 | 260 | 0.2135 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.1212 | 14.74 | 280 | 0.2470 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.1536 | 15.79 | 300 | 0.2224 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.1282 | 16.84 | 320 | 0.2466 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.11 | 17.89 | 340 | 0.2316 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.1228 | 18.95 | 360 | 0.2233 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.1243 | 20.0 | 380 | 0.1996 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0893 | 21.05 | 400 | 0.2074 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.1012 | 22.11 | 420 | 0.1941 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.1587 | 23.16 | 440 | 0.2007 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0913 | 24.21 | 460 | 0.2211 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0949 | 25.26 | 480 | 0.2621 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0863 | 26.32 | 500 | 0.2195 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.066 | 27.37 | 520 | 0.2221 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0738 | 28.42 | 540 | 0.2126 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0808 | 29.47 | 560 | 0.2068 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.062 | 30.53 | 580 | 0.2599 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0787 | 31.58 | 600 | 0.2366 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0535 | 32.63 | 620 | 0.2165 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0681 | 33.68 | 640 | 0.2212 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0574 | 34.74 | 660 | 0.2160 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.109 | 35.79 | 680 | 0.2281 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0702 | 36.84 | 700 | 0.2403 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0578 | 37.89 | 720 | 0.2141 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0643 | 38.95 | 740 | 0.2101 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0948 | 40.0 | 760 | 0.2118 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0453 | 41.05 | 780 | 0.2048 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0472 | 42.11 | 800 | 0.1924 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0699 | 43.16 | 820 | 0.2197 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0492 | 44.21 | 840 | 0.2172 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0888 | 45.26 | 860 | 0.2196 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0438 | 46.32 | 880 | 0.2196 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0524 | 47.37 | 900 | 0.2232 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0453 | 48.42 | 920 | 0.2184 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.1319 | 49.47 | 940 | 0.2080 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0423 | 50.53 | 960 | 0.2180 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0592 | 51.58 | 980 | 0.2251 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0395 | 52.63 | 1000 | 0.2198 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0451 | 53.68 | 1020 | 0.1953 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0446 | 54.74 | 1040 | 0.2072 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.048 | 55.79 | 1060 | 0.2222 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0623 | 56.84 | 1080 | 0.2264 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0765 | 57.89 | 1100 | 0.2419 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0661 | 58.95 | 1120 | 0.2251 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0625 | 60.0 | 1140 | 0.2254 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0373 | 61.05 | 1160 | 0.2209 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0402 | 62.11 | 1180 | 0.2178 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0339 | 63.16 | 1200 | 0.2076 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0608 | 64.21 | 1220 | 0.2177 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0301 | 65.26 | 1240 | 0.2048 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0461 | 66.32 | 1260 | 0.2124 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0395 | 67.37 | 1280 | 0.2188 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.1034 | 68.42 | 1300 | 0.2251 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0368 | 69.47 | 1320 | 0.2169 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0501 | 70.53 | 1340 | 0.2180 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0417 | 71.58 | 1360 | 0.2255 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0351 | 72.63 | 1380 | 0.2154 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0423 | 73.68 | 1400 | 0.2216 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.032 | 74.74 | 1420 | 0.2211 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0586 | 75.79 | 1440 | 0.2144 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0315 | 76.84 | 1460 | 0.2166 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.039 | 77.89 | 1480 | 0.2258 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0838 | 78.95 | 1500 | 0.2290 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0543 | 80.0 | 1520 | 0.2414 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0444 | 81.05 | 1540 | 0.2240 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0501 | 82.11 | 1560 | 0.2253 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0275 | 83.16 | 1580 | 0.2190 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0356 | 84.21 | 1600 | 0.2188 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0378 | 85.26 | 1620 | 0.2214 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0306 | 86.32 | 1640 | 0.2195 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0348 | 87.37 | 1660 | 0.2172 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0363 | 88.42 | 1680 | 0.2180 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0454 | 89.47 | 1700 | 0.2234 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0367 | 90.53 | 1720 | 0.2224 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0265 | 91.58 | 1740 | 0.2308 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0404 | 92.63 | 1760 | 0.2269 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0335 | 93.68 | 1780 | 0.2229 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0292 | 94.74 | 1800 | 0.2269 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.041 | 95.79 | 1820 | 0.2277 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0325 | 96.84 | 1840 | 0.2225 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0371 | 97.89 | 1860 | 0.2250 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0336 | 98.95 | 1880 | 0.2259 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
| 0.0445 | 100.0 | 1900 | 0.2286 | 0.8224 | 1.0 | 1.0 | 1.0 | 0.8224 |
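
Validation metrics of this form are typically produced with the mean_iou metric from the Hugging Face evaluate library. The sketch below shows such a computation on toy label maps; the two-class layout (background plus a single Branch class), the ignore_index value, and the reduce_labels setting are assumptions, not values confirmed by this card.

```python
# Illustrative mean-IoU computation with the evaluate library, using toy
# label maps. num_labels=2, ignore_index=255, and reduce_labels=False are
# assumptions for illustration only.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

predicted = np.array([[0, 0, 1, 1]] * 4)  # toy 4x4 prediction (0 = background, 1 = branch)
reference = np.array([[0, 1, 1, 1]] * 4)  # toy 4x4 ground truth

results = metric.compute(
    predictions=[predicted],
    references=[reference],
    num_labels=2,
    ignore_index=255,
    reduce_labels=False,  # the run name "NoReduce" suggests labels were kept as-is
)
print(results["mean_iou"], results["per_category_iou"])
```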

Framework versions

  • Transformers 4.33.0
  • Pytorch 2.0.1+cu117
  • Datasets 2.14.4
  • Tokenizers 0.13.3