
segformer-b0-scene-parse-150-lr-4-e-30-new-9img_FFT

This model is a fine-tuned version of DiTo97/binarization-segformer-b3 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1542
  • Mean Iou: 0.5040
  • Mean Accuracy: 0.5233
  • Overall Accuracy: 0.9668
  • Per Category Iou: [0.04118617051522149, 0.9667182726602117]
  • Per Category Accuracy: [0.05601009748261377, 0.9905862883445109]
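
The aggregate numbers above follow directly from the per-category values: Mean IoU and Mean Accuracy are unweighted means of the two per-category entries (the class labels behind the two entries are not documented here). A quick sanity check of the reported figures:

```python
# Per-category values copied from the evaluation results above.
per_category_iou = [0.04118617051522149, 0.9667182726602117]
per_category_accuracy = [0.05601009748261377, 0.9905862883445109]

# Mean IoU / Mean Accuracy are the unweighted means across the two classes.
mean_iou = sum(per_category_iou) / len(per_category_iou)
mean_accuracy = sum(per_category_accuracy) / len(per_category_accuracy)

print(f"{mean_iou:.4f}")       # 0.5040
print(f"{mean_accuracy:.4f}")  # 0.5233
```

Note the large gap between the two classes: the minority class reaches only ~0.04 IoU, so the high Overall Accuracy is driven almost entirely by the dominant class.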

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 30
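
With a linear scheduler and no warmup (the default when none is listed), the learning rate decays from its initial value to 0 over the full run. The results table shows 131 steps per epoch, so 30 epochs is 3930 optimizer steps. A minimal sketch of that schedule (`lr_at` is an illustrative helper, not part of the training code):

```python
# Linear decay schedule: lr falls from learning_rate to 0 over all steps.
LEARNING_RATE = 1e-4
STEPS_PER_EPOCH = 131          # from the training results table
TOTAL_STEPS = STEPS_PER_EPOCH * 30  # 3930

def lr_at(step: int) -> float:
    """Learning rate after `step` optimizer steps (linear decay, zero warmup)."""
    return LEARNING_RATE * max(0.0, (TOTAL_STEPS - step) / TOTAL_STEPS)

print(lr_at(0))     # 0.0001
print(lr_at(1965))  # halfway through training: 5e-05
print(lr_at(3930))  # 0.0
```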

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:---------------------:|
| No log | 1.0 | 131 | 0.1528 | 0.4873 | 0.5 | 0.9745 | [0.0, 0.9745119730631511] | [0.0, 1.0] |
| No log | 2.0 | 262 | 0.1889 | 0.4873 | 0.5 | 0.9745 | [0.0, 0.9745119730631511] | [0.0, 1.0] |
| No log | 3.0 | 393 | 0.1197 | 0.4873 | 0.5 | 0.9745 | [0.0, 0.9745119730631511] | [0.0, 1.0] |
| 0.4149 | 4.0 | 524 | 0.1201 | 0.4873 | 0.5000 | 0.9745 | [0.00010223979970973872, 0.9745065024545415] | [0.00010227193358809854, 0.9999917796143308] |
| 0.4149 | 5.0 | 655 | 0.1441 | 0.4873 | 0.5000 | 0.9745 | [2.7437511068541397e-05, 0.9745114466111445] | [2.7438811450465463e-05, 0.999998760418034] |
| 0.4149 | 6.0 | 786 | 0.1309 | 0.4873 | 0.5001 | 0.9745 | [0.0001570390902700574, 0.9744979472713902] | [0.00015714955648902945, 0.9999816019939785] |
| 0.4149 | 7.0 | 917 | 0.1319 | 0.4874 | 0.5002 | 0.9745 | [0.0003957310820631776, 0.9744648577316526] | [0.0003966155473294553, 0.9999415439241303] |
| 0.1817 | 8.0 | 1048 | 0.1295 | 0.4881 | 0.5009 | 0.9745 | [0.0018197547805853048, 0.9744708577898599] | [0.0018259281801582471, 0.9999112720276977] |
| 0.1817 | 9.0 | 1179 | 0.1309 | 0.4920 | 0.5047 | 0.9742 | [0.00972514515778975, 0.9741753903361423] | [0.009947816369496024, 0.9994011514281158] |
| 0.1817 | 10.0 | 1310 | 0.1343 | 0.4895 | 0.5022 | 0.9743 | [0.004785912791126633, 0.9742689722328164] | [0.004854175189327799, 0.9996269510693939] |
| 0.1817 | 11.0 | 1441 | 0.1212 | 0.4894 | 0.5022 | 0.9740 | [0.0048241029726707255, 0.9740276066246742] | [0.004938986061083783, 0.9993771426826694] |
| 0.1615 | 12.0 | 1572 | 0.1346 | 0.4931 | 0.5059 | 0.9737 | [0.012490729722833617, 0.9736626335700124] | [0.013065863125230736, 0.9987957134994652] |
| 0.1615 | 13.0 | 1703 | 0.1258 | 0.4912 | 0.5040 | 0.9740 | [0.008481020628593294, 0.9739639804750839] | [0.008735519790866368, 0.9992151488920616] |
| 0.1615 | 14.0 | 1834 | 0.1365 | 0.4989 | 0.5125 | 0.9727 | [0.025065780362830633, 0.9726777607699496] | [0.02754108338405356, 0.9974171678709749] |
| 0.1615 | 15.0 | 1965 | 0.1296 | 0.4969 | 0.5101 | 0.9732 | [0.02060107372759099, 0.9731805906599663] | [0.022120670903884337, 0.9980707537728308] |
| 0.144 | 16.0 | 2096 | 0.1436 | 0.5050 | 0.5219 | 0.9697 | [0.040424047311196594, 0.9696216140306383] | [0.05014567514442793, 0.9937100348987993] |
| 0.144 | 17.0 | 2227 | 0.1446 | 0.5023 | 0.5177 | 0.9707 | [0.03386431595215697, 0.970717796701266] | [0.04022779202378696, 0.9950852532283281] |
| 0.144 | 18.0 | 2358 | 0.1420 | 0.5028 | 0.5179 | 0.9712 | [0.03438079444116839, 0.971131946623355] | [0.040285164084092474, 0.9955083421256665] |
| 0.144 | 19.0 | 2489 | 0.1403 | 0.5018 | 0.5171 | 0.9706 | [0.032975986996447486, 0.9706187216781221] | [0.03926992806042525, 0.9950080076995003] |
| 0.1327 | 20.0 | 2620 | 0.1390 | 0.4983 | 0.5121 | 0.9720 | [0.024489823088375932, 0.9720288138083142] | [0.02753110563443521, 0.9967519690433324] |
| 0.1327 | 21.0 | 2751 | 0.1469 | 0.5044 | 0.5221 | 0.9686 | [0.04024195335923191, 0.9685753252351375] | [0.051627370962753064, 0.9926002175923039] |
| 0.1327 | 22.0 | 2882 | 0.1419 | 0.4993 | 0.5137 | 0.9714 | [0.027247862213791583, 0.9713509036761706] | [0.031459844546660946, 0.9959570055561978] |
| 0.1223 | 23.0 | 3013 | 0.1452 | 0.5014 | 0.5171 | 0.9700 | [0.03283676275044511, 0.9699570020594283] | [0.03997834828332818, 0.9943116888403959] |
| 0.1223 | 24.0 | 3144 | 0.1492 | 0.5028 | 0.5202 | 0.9684 | [0.03726474521737445, 0.9683817614075229] | [0.04795805354060445, 0.9924947878840388] |
| 0.1223 | 25.0 | 3275 | 0.1432 | 0.4988 | 0.5133 | 0.9711 | [0.026574581969862024, 0.9710595860086739] | [0.030973429252766332, 0.9956706621220547] |
| 0.1223 | 26.0 | 3406 | 0.1483 | 0.5022 | 0.5190 | 0.9688 | [0.035495434136009404, 0.9688123464095088] | [0.04497969527952666, 0.9930115630815434] |
| 0.1163 | 27.0 | 3537 | 0.1511 | 0.5033 | 0.5211 | 0.9681 | [0.03854419663037566, 0.9680458059790112] | [0.05019556389251968, 0.9920938157386199] |
| 0.1163 | 28.0 | 3668 | 0.1509 | 0.5034 | 0.5218 | 0.9675 | [0.03933910306845004, 0.9674300294951071] | [0.05225846362611376, 0.9914105451107365] |
| 0.1163 | 29.0 | 3799 | 0.1559 | 0.5050 | 0.5268 | 0.9649 | [0.04520760305755583, 0.9648560029666593] | [0.06517715494447382, 0.9884467046300605] |
| 0.1163 | 30.0 | 3930 | 0.1542 | 0.5040 | 0.5233 | 0.9668 | [0.04118617051522149, 0.9667182726602117] | [0.05601009748261377, 0.9905862883445109] |
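
The exact evaluation code is not documented in this card, but the per-category columns follow the standard semantic-segmentation definitions: per-class IoU is TP / (TP + FP + FN) and per-class accuracy is per-class recall, TP / (TP + FN). A minimal sketch with an illustrative 2-class confusion matrix (the matrix values are made up, not from this model):

```python
# Toy 2-class confusion matrix; rows = ground truth, columns = prediction.
confusion = [
    [8, 2],   # ground-truth class 0: 8 predicted correctly, 2 as class 1
    [1, 89],  # ground-truth class 1: 1 predicted as class 0, 89 correctly
]

def per_class_metrics(cm):
    """Per-class IoU and accuracy (recall) from a square confusion matrix."""
    n = len(cm)
    ious, accs = [], []
    for c in range(n):
        tp = cm[c][c]
        fn = sum(cm[c]) - tp                       # class c missed
        fp = sum(cm[r][c] for r in range(n)) - tp  # other classes predicted as c
        ious.append(tp / (tp + fp + fn))
        accs.append(tp / (tp + fn))
    return ious, accs

ious, accs = per_class_metrics(confusion)
print([round(v, 4) for v in ious])  # [0.7273, 0.9674]
print([round(v, 4) for v in accs])  # [0.8, 0.9889]
```

Under these definitions, the near-zero class-0 IoU in the early epochs means the model initially predicted only the majority class; the [0.0, ...] IoU with [0.0, 1.0] accuracy rows are exactly that degenerate case.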

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.2.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2
Model size

  • 47.2M params (Safetensors, F32)
