segformer-b0-practice-7-11

This model is a fine-tuned version of nvidia/mit-b0 on the chugz/SEM dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1941
  • Mean Iou: 0.5755
  • Mean Accuracy: 0.7536
  • Overall Accuracy: 0.9340
  • Accuracy Background: nan
  • Accuracy Silver: 0.9537
  • Accuracy Glass: 0.0749
  • Accuracy Silicon: 0.9909
  • Accuracy Void: 0.8095
  • Accuracy Interfacial void: 0.9391
  • Iou Background: 0.0
  • Iou Silver: 0.8956
  • Iou Glass: 0.0717
  • Iou Silicon: 0.9803
  • Iou Void: 0.6885
  • Iou Interfacial void: 0.8168
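
For reference, here is a minimal inference sketch, assuming the checkpoint is published at chugz/segformer-b0-practice-7-11 and ships a preprocessor config (if it does not, the base model's processor from nvidia/mit-b0 can be substituted). The input file name is illustrative.

```python
from PIL import Image
import torch
import torch.nn.functional as F
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo_id = "chugz/segformer-b0-practice-7-11"

# Load the fine-tuned checkpoint; if the repo lacks a preprocessor config,
# fall back to the base model's processor ("nvidia/mit-b0").
processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("sem_image.png").convert("RGB")  # illustrative file name
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 of the input resolution; upsample before the argmax.
upsampled = F.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # (H, W) map of class indices
```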

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
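
As a hedged reconstruction, these values map onto transformers.TrainingArguments as sketched below; the output_dir and any arguments not listed above are assumptions, and the actual training script may have differed.

```python
from transformers import TrainingArguments

# Assumed reconstruction of the listed hyperparameters; output_dir and any
# arguments not shown in the list above are illustrative, not confirmed values.
training_args = TrainingArguments(
    output_dir="segformer-b0-practice-7-11",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```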

Training results

Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Silver | Accuracy Glass | Accuracy Silicon | Accuracy Void | Accuracy Interfacial void | Iou Background | Iou Silver | Iou Glass | Iou Silicon | Iou Void | Iou Interfacial void
1.2415 0.6061 20 1.3040 0.2800 0.4306 0.7396 nan 0.9612 0.0069 0.9153 0.0 0.2698 0.0 0.5882 0.0069 0.8301 0.0 0.2549
0.9185 1.2121 40 0.7957 0.3642 0.5203 0.8249 nan 0.9524 0.0 0.9539 0.0000 0.6951 0.0 0.7181 0.0 0.8980 0.0000 0.5691
0.6451 1.8182 60 0.6017 0.3679 0.5206 0.8131 nan 0.8835 0.0 0.9456 0.0049 0.7688 0.0 0.6888 0.0 0.9053 0.0049 0.6086
0.5145 2.4242 80 0.5779 0.4033 0.5635 0.8393 nan 0.9514 0.0 0.9371 0.1596 0.7693 0.0 0.7321 0.0 0.9199 0.1516 0.6165
0.9509 3.0303 100 0.5006 0.4017 0.5540 0.8420 nan 0.9510 0.0 0.9548 0.1047 0.7592 0.0 0.7312 0.0 0.9273 0.1010 0.6506
0.6075 3.6364 120 0.4274 0.4688 0.6372 0.8776 nan 0.9414 0.0 0.9613 0.4300 0.8535 0.0 0.7953 0.0 0.9395 0.3709 0.7070
0.4155 4.2424 140 0.3949 0.4994 0.6781 0.8912 nan 0.9260 0.0 0.9560 0.5908 0.9177 0.0 0.8258 0.0 0.9412 0.4981 0.7313
0.4072 4.8485 160 0.3538 0.5040 0.6794 0.8961 nan 0.9463 0.0 0.9668 0.6083 0.8758 0.0 0.8231 0.0 0.9540 0.4978 0.7493
0.3085 5.4545 180 0.3441 0.5033 0.6814 0.8919 nan 0.9506 0.0 0.9536 0.6344 0.8684 0.0 0.8170 0.0 0.9417 0.5214 0.7398
0.2885 6.0606 200 0.3357 0.5136 0.6967 0.8989 nan 0.9082 0.0 0.9780 0.6987 0.8987 0.0 0.8389 0.0 0.9476 0.5652 0.7302
0.2754 6.6667 220 0.3052 0.5253 0.7061 0.9094 nan 0.9355 0.0 0.9714 0.6918 0.9319 0.0 0.8553 0.0 0.9577 0.5742 0.7643
0.2942 7.2727 240 0.2893 0.5226 0.6948 0.9087 nan 0.9388 0.0 0.9805 0.6373 0.9175 0.0 0.8466 0.0 0.9618 0.5504 0.7766
0.2324 7.8788 260 0.3018 0.5221 0.7053 0.9053 nan 0.9031 0.0 0.9754 0.6931 0.9549 0.0 0.8499 0.0 0.9659 0.5796 0.7370
0.2872 8.4848 280 0.2758 0.5339 0.7129 0.9139 nan 0.9160 0.0 0.9880 0.7207 0.9396 0.0 0.8585 0.0 0.9739 0.6018 0.7694
0.2076 9.0909 300 0.2615 0.5317 0.7011 0.9168 nan 0.9431 0.0 0.9920 0.6451 0.9253 0.0 0.8670 0.0 0.9715 0.5706 0.7811
0.2796 9.6970 320 0.2718 0.5226 0.6956 0.9078 nan 0.9573 0.0 0.9720 0.6536 0.8953 0.0 0.8501 0.0 0.9577 0.5505 0.7771
0.3215 10.3030 340 0.2627 0.5414 0.7207 0.9193 nan 0.9303 0.0 0.9847 0.7407 0.9476 0.0 0.8680 0.0 0.9735 0.6193 0.7878
0.209 10.9091 360 0.2516 0.5337 0.7073 0.9189 nan 0.9487 0.0 0.9898 0.6761 0.9221 0.0 0.8733 0.0 0.9734 0.5741 0.7818
0.2928 11.5152 380 0.2606 0.5457 0.7354 0.9225 nan 0.9284 0.0 0.9831 0.8203 0.9452 0.0 0.8774 0.0 0.9733 0.6401 0.7831
0.2246 12.1212 400 0.2519 0.5378 0.7107 0.9180 nan 0.9373 0.0 0.9906 0.6993 0.9263 0.0 0.8686 0.0 0.9691 0.6136 0.7755
0.2386 12.7273 420 0.2477 0.5443 0.7290 0.9212 nan 0.9407 0.0001 0.9828 0.7968 0.9245 0.0 0.8733 0.0001 0.9703 0.6294 0.7929
0.1734 13.3333 440 0.2285 0.5466 0.7234 0.9234 nan 0.9479 0.0005 0.9879 0.7531 0.9274 0.0 0.8791 0.0005 0.9747 0.6310 0.7940
0.1809 13.9394 460 0.2304 0.5501 0.7254 0.9259 nan 0.9451 0.0002 0.9923 0.7544 0.9349 0.0 0.8799 0.0002 0.9774 0.6386 0.8047
0.2162 14.5455 480 0.2431 0.5514 0.7364 0.9257 nan 0.9451 0.0022 0.9792 0.8061 0.9493 0.0 0.8825 0.0022 0.9714 0.6526 0.7996
0.1973 15.1515 500 0.2519 0.5480 0.7321 0.9188 nan 0.9306 0.0002 0.9773 0.8158 0.9364 0.0 0.8645 0.0002 0.9631 0.6636 0.7967
0.1218 15.7576 520 0.2275 0.5513 0.7291 0.9245 nan 0.9463 0.0027 0.9827 0.7704 0.9436 0.0 0.8842 0.0027 0.9712 0.6524 0.7972
0.1809 16.3636 540 0.2323 0.5540 0.7339 0.9271 nan 0.9406 0.0014 0.9876 0.7905 0.9495 0.0 0.8827 0.0014 0.9744 0.6583 0.8071
0.1484 16.9697 560 0.2364 0.5527 0.7374 0.9223 nan 0.9343 0.0027 0.9759 0.8231 0.9509 0.0 0.8694 0.0027 0.9657 0.6680 0.8105
0.1464 17.5758 580 0.2338 0.5519 0.7376 0.9254 nan 0.9397 0.0030 0.9849 0.8248 0.9354 0.0 0.8782 0.0030 0.9752 0.6523 0.8029
0.1389 18.1818 600 0.2355 0.5541 0.7350 0.9227 nan 0.9281 0.0044 0.9822 0.8067 0.9537 0.0 0.8759 0.0044 0.9722 0.6756 0.7967
0.115 18.7879 620 0.2175 0.5478 0.7175 0.9242 nan 0.9554 0.0080 0.9932 0.7125 0.9180 0.0 0.8806 0.0079 0.9735 0.6302 0.7947
0.1704 19.3939 640 0.2246 0.5552 0.7320 0.9283 nan 0.9413 0.0079 0.9919 0.7671 0.9517 0.0 0.8883 0.0078 0.9815 0.6585 0.7951
0.1537 20.0 660 0.2222 0.5590 0.7370 0.9299 nan 0.9461 0.0193 0.9919 0.7830 0.9445 0.0 0.8923 0.0189 0.9803 0.6620 0.8002
0.1605 20.6061 680 0.2174 0.5578 0.7417 0.9273 nan 0.9388 0.0084 0.9865 0.8359 0.9390 0.0 0.8833 0.0083 0.9749 0.6706 0.8096
0.1237 21.2121 700 0.2154 0.5639 0.7384 0.9312 nan 0.9496 0.0219 0.9935 0.7867 0.9404 0.0 0.8913 0.0214 0.9812 0.6799 0.8096
0.1288 21.8182 720 0.2214 0.5627 0.7430 0.9303 nan 0.9406 0.0237 0.9907 0.8101 0.9497 0.0 0.8926 0.0233 0.9799 0.6771 0.8034
0.122 22.4242 740 0.2160 0.5675 0.7469 0.9309 nan 0.9367 0.0376 0.9922 0.8161 0.9519 0.0 0.8921 0.0361 0.9816 0.6884 0.8067
0.1608 23.0303 760 0.2112 0.5613 0.7412 0.9304 nan 0.9417 0.0160 0.9934 0.8137 0.9410 0.0 0.8907 0.0157 0.9802 0.6695 0.8116
0.1258 23.6364 780 0.2230 0.5611 0.7425 0.9293 nan 0.9367 0.0196 0.9898 0.8148 0.9518 0.0 0.8894 0.0192 0.9808 0.6733 0.8036
0.1333 24.2424 800 0.2171 0.5606 0.7377 0.9280 nan 0.9420 0.0207 0.9871 0.7868 0.9520 0.0 0.8865 0.0203 0.9773 0.6734 0.8063
0.1562 24.8485 820 0.2183 0.5644 0.7477 0.9302 nan 0.9301 0.0299 0.9929 0.8333 0.9523 0.0 0.8882 0.0291 0.9813 0.6798 0.8081
0.1039 25.4545 840 0.2106 0.5648 0.7389 0.9302 nan 0.9506 0.0412 0.9920 0.7729 0.9379 0.0 0.8890 0.0392 0.9804 0.6668 0.8136
0.1079 26.0606 860 0.2099 0.5625 0.7449 0.9299 nan 0.9385 0.0233 0.9896 0.8246 0.9486 0.0 0.8917 0.0227 0.9796 0.6752 0.8056
0.0995 26.6667 880 0.2122 0.5620 0.7403 0.9310 nan 0.9464 0.0210 0.9913 0.7954 0.9474 0.0 0.8916 0.0207 0.9793 0.6690 0.8115
0.1326 27.2727 900 0.2150 0.5611 0.7428 0.9291 nan 0.9412 0.0198 0.9879 0.8193 0.9459 0.0 0.8868 0.0195 0.9785 0.6684 0.8130
0.1602 27.8788 920 0.2108 0.5645 0.7438 0.9312 nan 0.9388 0.0249 0.9925 0.8091 0.9537 0.0 0.8935 0.0243 0.9812 0.6835 0.8044
0.1758 28.4848 940 0.2116 0.5682 0.7461 0.9326 nan 0.9520 0.0347 0.9891 0.8089 0.9459 0.0 0.9004 0.0336 0.9775 0.6888 0.8089
0.3986 29.0909 960 0.1990 0.5783 0.7581 0.9314 nan 0.9414 0.1021 0.9878 0.8095 0.9496 0.0 0.8923 0.0941 0.9785 0.6951 0.8099
0.1477 29.6970 980 0.2202 0.5627 0.7442 0.9290 nan 0.9450 0.0213 0.9847 0.8258 0.9442 0.0 0.8857 0.0210 0.9749 0.6807 0.8142
0.1221 30.3030 1000 0.2288 0.5621 0.7452 0.9292 nan 0.9340 0.0226 0.9869 0.8218 0.9606 0.0 0.8922 0.0224 0.9775 0.6855 0.7950
0.1092 30.9091 1020 0.2079 0.5696 0.7507 0.9297 nan 0.9404 0.0511 0.9902 0.8393 0.9322 0.0 0.8880 0.0478 0.9802 0.6916 0.8100
0.1488 31.5152 1040 0.2147 0.5675 0.7501 0.9306 nan 0.9439 0.0550 0.9867 0.8164 0.9484 0.0 0.8939 0.0528 0.9773 0.6731 0.8078
0.1177 32.1212 1060 0.2240 0.5688 0.7470 0.9311 nan 0.9409 0.0466 0.9899 0.8028 0.9550 0.0 0.8952 0.0444 0.9801 0.6910 0.8022
0.1121 32.7273 1080 0.2043 0.5700 0.7471 0.9327 nan 0.9542 0.0467 0.9895 0.8043 0.9409 0.0 0.8949 0.0449 0.9791 0.6885 0.8129
0.1263 33.3333 1100 0.2120 0.5679 0.7516 0.9299 nan 0.9341 0.0447 0.9888 0.8415 0.9489 0.0 0.8896 0.0434 0.9785 0.6851 0.8110
0.0922 33.9394 1120 0.2104 0.5721 0.7507 0.9331 nan 0.9512 0.0605 0.9902 0.8086 0.9427 0.0 0.8984 0.0574 0.9797 0.6855 0.8118
0.1549 34.5455 1140 0.2276 0.5624 0.7428 0.9296 nan 0.9371 0.0369 0.9895 0.7912 0.9593 0.0 0.8951 0.0359 0.9800 0.6682 0.7951
0.1493 35.1515 1160 0.1981 0.5739 0.7532 0.9336 nan 0.9495 0.0723 0.9915 0.8094 0.9431 0.0 0.8946 0.0698 0.9817 0.6780 0.8197
0.1176 35.7576 1180 0.2030 0.5757 0.7552 0.9351 nan 0.9506 0.0684 0.9925 0.8196 0.9447 0.0 0.9026 0.0648 0.9820 0.6921 0.8129
0.229 36.3636 1200 0.2046 0.5730 0.7524 0.9337 nan 0.9468 0.0560 0.9917 0.8208 0.9466 0.0 0.8978 0.0539 0.9816 0.6888 0.8158
0.1419 36.9697 1220 0.2069 0.5695 0.7491 0.9322 nan 0.9410 0.0449 0.9909 0.8142 0.9546 0.0 0.8945 0.0439 0.9812 0.6880 0.8093
0.0725 37.5758 1240 0.2001 0.5724 0.7554 0.9322 nan 0.9400 0.0641 0.9916 0.8397 0.9418 0.0 0.8937 0.0612 0.9806 0.6830 0.8157
0.0653 38.1818 1260 0.2039 0.5729 0.7530 0.9344 nan 0.9474 0.0577 0.9916 0.8173 0.9510 0.0 0.8999 0.0561 0.9823 0.6843 0.8151
0.1446 38.7879 1280 0.2051 0.5729 0.7554 0.9325 nan 0.9349 0.0637 0.9926 0.8325 0.9531 0.0 0.8944 0.0619 0.9823 0.6853 0.8138
0.1172 39.3939 1300 0.2099 0.5719 0.7498 0.9320 nan 0.9499 0.0707 0.9881 0.7914 0.9486 0.0 0.8959 0.0674 0.9797 0.6778 0.8108
0.0907 40.0 1320 0.2099 0.5691 0.7500 0.9323 nan 0.9445 0.0536 0.9894 0.8095 0.9528 0.0 0.8966 0.0520 0.9803 0.6778 0.8078
0.1174 40.6061 1340 0.2221 0.5677 0.7461 0.9332 nan 0.9491 0.0481 0.9913 0.7887 0.9535 0.0 0.9008 0.0464 0.9819 0.6707 0.8064
0.1053 41.2121 1360 0.2092 0.5699 0.7493 0.9320 nan 0.9447 0.0571 0.9904 0.8045 0.9496 0.0 0.8970 0.0545 0.9810 0.6814 0.8056
0.1026 41.8182 1380 0.2012 0.5752 0.7541 0.9338 nan 0.9489 0.0759 0.9926 0.8129 0.9404 0.0 0.8954 0.0720 0.9819 0.6856 0.8164
0.1371 42.4242 1400 0.1945 0.5783 0.7575 0.9336 nan 0.9466 0.0953 0.9927 0.8133 0.9399 0.0 0.8959 0.0889 0.9823 0.6863 0.8162
0.0799 43.0303 1420 0.2107 0.5712 0.7508 0.9332 nan 0.9492 0.0619 0.9902 0.8045 0.9483 0.0 0.8972 0.0595 0.9816 0.6766 0.8124
0.1458 43.6364 1440 0.1906 0.5788 0.7577 0.9339 nan 0.9489 0.0892 0.9909 0.8185 0.9410 0.0 0.8943 0.0843 0.9805 0.6925 0.8213
0.112 44.2424 1460 0.2091 0.5726 0.7541 0.9329 nan 0.9418 0.0579 0.9901 0.8288 0.9518 0.0 0.8953 0.0560 0.9802 0.6893 0.8149
0.1311 44.8485 1480 0.2022 0.5730 0.7519 0.9326 nan 0.9512 0.0656 0.9888 0.8131 0.9408 0.0 0.8959 0.0622 0.9791 0.6862 0.8146
0.1474 45.4545 1500 0.2001 0.5743 0.7578 0.9328 nan 0.9410 0.0698 0.9908 0.8441 0.9436 0.0 0.8939 0.0671 0.9805 0.6887 0.8157
0.0953 46.0606 1520 0.2072 0.5764 0.7568 0.9338 nan 0.9420 0.0672 0.9906 0.8310 0.9530 0.0 0.8954 0.0651 0.9815 0.7000 0.8167
0.1038 46.6667 1540 0.2003 0.5760 0.7544 0.9339 nan 0.9465 0.0760 0.9921 0.8092 0.9480 0.0 0.8963 0.0728 0.9824 0.6886 0.8157
0.1362 47.2727 1560 0.1978 0.5755 0.7533 0.9327 nan 0.9482 0.0793 0.9908 0.8065 0.9416 0.0 0.8950 0.0751 0.9804 0.6895 0.8130
0.1052 47.8788 1580 0.2080 0.5750 0.7526 0.9336 nan 0.9484 0.0676 0.9896 0.8045 0.9528 0.0 0.8969 0.0653 0.9802 0.6947 0.8129
0.0813 48.4848 1600 0.2054 0.5744 0.7548 0.9338 nan 0.9439 0.0579 0.9919 0.8332 0.9471 0.0 0.8961 0.0559 0.9819 0.6965 0.8163
0.0992 49.0909 1620 0.2016 0.5765 0.7563 0.9344 nan 0.9467 0.0730 0.9922 0.8243 0.9455 0.0 0.8952 0.0702 0.9826 0.6912 0.8200
0.1604 49.6970 1640 0.1941 0.5755 0.7536 0.9340 nan 0.9537 0.0749 0.9909 0.8095 0.9391 0.0 0.8956 0.0717 0.9803 0.6885 0.8168
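
Note that the reported Mean Iou is the unweighted average of the six per-class IoU values, including Iou Background at 0.0: (0.0 + 0.8956 + 0.0717 + 0.9803 + 0.6885 + 0.8168) / 6 ≈ 0.5755. The nan Accuracy Background suggests that no Background pixels occur in the validation masks. A minimal sketch of computing these metrics with the evaluate library's mean_iou metric follows; the toy arrays and the ignore_index value are assumptions, not taken from chugz/SEM.

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy 2x2 masks with the six labels (0 = Background ... 5 = Interfacial void);
# the arrays below are illustrative only.
predictions = [np.array([[1, 3], [3, 4]])]
references = [np.array([[1, 3], [3, 5]])]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=6,
    ignore_index=255,  # assumption: unlabeled pixels are mapped to 255
    reduce_labels=False,
)
print(results["mean_iou"], results["per_category_iou"])
```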

Framework versions

  • Transformers 4.42.4
  • Pytorch 2.3.0+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1