
segformer-b5-finetuned-segments-metalchip-4-Mar-10epochs

This model is a fine-tuned version of nvidia/mit-b5 on the segments/metalchip-semantic dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1278
  • Mean IoU: 0.9007
  • Mean Accuracy: 0.9476
  • Overall Accuracy: 0.9478
  • Accuracy (Background): 0.9538
  • Accuracy (Metal lines): 0.9414
  • IoU (Background): 0.9047
  • IoU (Metal lines): 0.8966

Model description

SegFormer pairs a hierarchical Transformer encoder (here MiT-B5) with a lightweight all-MLP decode head for semantic segmentation. This checkpoint was fine-tuned for a two-class task, separating Background from Metal lines in chip imagery.

Intended uses & limitations

The model is intended for binary semantic segmentation of chip images into Background and Metal lines; its behavior on imagery outside the training distribution has not been characterized. A minimal inference sketch follows.
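A minimal inference sketch, assuming the checkpoint is hosted on the Hugging Face Hub ("your-username/…" is a placeholder repo id and "chip.png" a hypothetical input image):

```python
# Minimal inference sketch; not the author's published example.
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Placeholder repo id; substitute the actual Hub path of this checkpoint.
repo_id = "your-username/segformer-b5-finetuned-segments-metalchip-4-Mar-10epochs"
processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("chip.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # per-pixel class ids: Background vs. Metal lines
```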

Training and evaluation data

The model was fine-tuned on the segments/metalchip-semantic dataset, which labels two classes (Background and Metal lines); the metrics above were computed on its evaluation split.

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.01
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
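
These hyperparameters map onto a standard transformers Trainer setup. The sketch below is an assumption about how such a run could be reproduced, not the author's actual script; `train_ds` and `eval_ds` stand for preprocessed splits of the dataset (pixel values plus integer label masks) and are not defined here, and `eval_steps=20` matches the evaluation cadence visible in the results table.

```python
# Hedged training sketch consistent with the listed hyperparameters.
from transformers import SegformerForSemanticSegmentation, Trainer, TrainingArguments

model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b5",
    num_labels=2,
    id2label={0: "Background", 1: "Metal lines"},
    label2id={"Background": 0, "Metal lines": 1},
)

args = TrainingArguments(
    output_dir="segformer-b5-finetuned-segments-metalchip",
    learning_rate=1e-2,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",     # linear decay, the Trainer default
    num_train_epochs=20,
    evaluation_strategy="steps",
    eval_steps=20,                  # matches the 20-step cadence in the results table
    save_total_limit=2,
)
# The default optimizer (AdamW) uses betas=(0.9, 0.999) and epsilon=1e-08, as listed above.

trainer = Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds)
trainer.train()
```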

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean IoU | Mean Accuracy | Overall Accuracy | Accuracy (Background) | Accuracy (Metal lines) | IoU (Background) | IoU (Metal lines) |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.2928 | 0.25 | 20 | 0.5604 | 0.7372 | 0.8474 | 0.8531 | 0.9962 | 0.6985 | 0.7788 | 0.6957 |
| 0.234 | 0.51 | 40 | 0.4095 | 0.5873 | 0.7566 | 0.7479 | 0.5316 | 0.9815 | 0.5227 | 0.6518 |
| 0.2606 | 0.76 | 60 | 0.1990 | 0.8682 | 0.9282 | 0.9298 | 0.9703 | 0.8861 | 0.8777 | 0.8586 |
| 0.1575 | 1.01 | 80 | 0.1853 | 0.8604 | 0.9238 | 0.9254 | 0.9670 | 0.8806 | 0.8706 | 0.8502 |
| 0.155 | 1.27 | 100 | 0.3476 | 0.7775 | 0.8728 | 0.8775 | 0.9935 | 0.7522 | 0.8080 | 0.7469 |
| 0.1565 | 1.52 | 120 | 0.1813 | 0.8615 | 0.9243 | 0.9260 | 0.9689 | 0.8798 | 0.8718 | 0.8512 |
| 0.1773 | 1.77 | 140 | 0.1656 | 0.8830 | 0.9382 | 0.9379 | 0.9297 | 0.9467 | 0.8860 | 0.8799 |
| 0.1407 | 2.03 | 160 | 0.1653 | 0.8841 | 0.9378 | 0.9387 | 0.9629 | 0.9126 | 0.8908 | 0.8775 |
| 0.1692 | 2.28 | 180 | 0.1726 | 0.8720 | 0.9331 | 0.9316 | 0.8949 | 0.9712 | 0.8717 | 0.8723 |
| 0.1525 | 2.53 | 200 | 0.1415 | 0.8924 | 0.9434 | 0.9432 | 0.9383 | 0.9484 | 0.8955 | 0.8892 |
| 0.1434 | 2.78 | 220 | 0.3651 | 0.7260 | 0.8487 | 0.8431 | 0.7012 | 0.9963 | 0.6988 | 0.7533 |
| 0.1701 | 3.04 | 240 | 0.1505 | 0.8848 | 0.9396 | 0.9389 | 0.9216 | 0.9576 | 0.8868 | 0.8829 |
| 0.093 | 3.29 | 260 | 0.1391 | 0.8931 | 0.9438 | 0.9436 | 0.9370 | 0.9507 | 0.8961 | 0.8901 |
| 0.1331 | 3.54 | 280 | 0.1363 | 0.8936 | 0.9440 | 0.9439 | 0.9395 | 0.9486 | 0.8968 | 0.8904 |
| 0.124 | 3.8 | 300 | 0.1329 | 0.8960 | 0.9451 | 0.9452 | 0.9484 | 0.9418 | 0.8999 | 0.8921 |
| 0.144 | 4.05 | 320 | 0.1461 | 0.8847 | 0.9381 | 0.9391 | 0.9642 | 0.9120 | 0.8915 | 0.8780 |
| 0.1226 | 4.3 | 340 | 0.1463 | 0.8884 | 0.9416 | 0.9409 | 0.9239 | 0.9592 | 0.8903 | 0.8864 |
| 0.1344 | 4.56 | 360 | 0.1323 | 0.8986 | 0.9467 | 0.9467 | 0.9453 | 0.9482 | 0.9020 | 0.8953 |
| 0.1497 | 4.81 | 380 | 0.1382 | 0.8929 | 0.9435 | 0.9435 | 0.9418 | 0.9453 | 0.8964 | 0.8894 |
| 0.146 | 5.06 | 400 | 0.1382 | 0.8938 | 0.9434 | 0.9441 | 0.9600 | 0.9268 | 0.8991 | 0.8885 |
| 0.1046 | 5.32 | 420 | 0.1449 | 0.8899 | 0.9419 | 0.9418 | 0.9401 | 0.9436 | 0.8934 | 0.8863 |
| 0.1355 | 5.57 | 440 | 0.1378 | 0.8949 | 0.9448 | 0.9446 | 0.9386 | 0.9511 | 0.8979 | 0.8919 |
| 0.1268 | 5.82 | 460 | 0.1334 | 0.8976 | 0.9461 | 0.9461 | 0.9449 | 0.9473 | 0.9010 | 0.8941 |
| 0.1146 | 6.08 | 480 | 0.1312 | 0.8990 | 0.9468 | 0.9469 | 0.9500 | 0.9435 | 0.9028 | 0.8952 |
| 0.13 | 6.33 | 500 | 0.1315 | 0.9000 | 0.9474 | 0.9475 | 0.9490 | 0.9457 | 0.9036 | 0.8964 |
| 0.1196 | 6.58 | 520 | 0.1325 | 0.8981 | 0.9466 | 0.9464 | 0.9401 | 0.9531 | 0.9010 | 0.8952 |
| 0.1111 | 6.84 | 540 | 0.1298 | 0.8987 | 0.9465 | 0.9467 | 0.9507 | 0.9424 | 0.9025 | 0.8948 |
| 0.095 | 7.09 | 560 | 0.1351 | 0.8990 | 0.9465 | 0.9469 | 0.9593 | 0.9336 | 0.9037 | 0.8943 |
| 0.1077 | 7.34 | 580 | 0.1316 | 0.8990 | 0.9467 | 0.9469 | 0.9530 | 0.9404 | 0.9031 | 0.8950 |
| 0.1246 | 7.59 | 600 | 0.1416 | 0.8896 | 0.9414 | 0.9417 | 0.9502 | 0.9325 | 0.8943 | 0.8849 |
| 0.1108 | 7.85 | 620 | 0.1406 | 0.8907 | 0.9413 | 0.9424 | 0.9711 | 0.9115 | 0.8975 | 0.8839 |
| 0.148 | 8.1 | 640 | 0.1337 | 0.8949 | 0.9441 | 0.9447 | 0.9591 | 0.9291 | 0.9000 | 0.8898 |
| 0.1056 | 8.35 | 660 | 0.1290 | 0.9007 | 0.9474 | 0.9478 | 0.9581 | 0.9368 | 0.9051 | 0.8962 |
| 0.1003 | 8.61 | 680 | 0.1291 | 0.8972 | 0.9458 | 0.9459 | 0.9486 | 0.9429 | 0.9010 | 0.8934 |
| 0.1089 | 8.86 | 700 | 0.1295 | 0.9008 | 0.9476 | 0.9479 | 0.9560 | 0.9392 | 0.9050 | 0.8966 |
| 0.0924 | 9.11 | 720 | 0.1297 | 0.9004 | 0.9476 | 0.9477 | 0.9500 | 0.9451 | 0.9041 | 0.8967 |
| 0.0851 | 9.37 | 740 | 0.1406 | 0.8937 | 0.9430 | 0.9440 | 0.9694 | 0.9166 | 0.9000 | 0.8873 |
| 0.1223 | 9.62 | 760 | 0.1341 | 0.8983 | 0.9461 | 0.9466 | 0.9593 | 0.9329 | 0.9031 | 0.8936 |
| 0.1592 | 9.87 | 780 | 0.1306 | 0.8985 | 0.9464 | 0.9466 | 0.9522 | 0.9406 | 0.9025 | 0.8944 |
| 0.1344 | 10.13 | 800 | 0.1293 | 0.8991 | 0.9469 | 0.9470 | 0.9483 | 0.9455 | 0.9028 | 0.8955 |
| 0.1216 | 10.38 | 820 | 0.1309 | 0.8982 | 0.9463 | 0.9465 | 0.9521 | 0.9404 | 0.9023 | 0.8942 |
| 0.1086 | 10.63 | 840 | 0.1327 | 0.8981 | 0.9462 | 0.9464 | 0.9531 | 0.9393 | 0.9023 | 0.8940 |
| 0.1199 | 10.89 | 860 | 0.1323 | 0.8990 | 0.9465 | 0.9469 | 0.9580 | 0.9350 | 0.9036 | 0.8944 |
| 0.1148 | 11.14 | 880 | 0.1290 | 0.9005 | 0.9477 | 0.9477 | 0.9479 | 0.9475 | 0.9040 | 0.8970 |
| 0.1368 | 11.39 | 900 | 0.1298 | 0.9000 | 0.9471 | 0.9475 | 0.9571 | 0.9370 | 0.9044 | 0.8955 |
| 0.1289 | 11.65 | 920 | 0.1335 | 0.8952 | 0.9447 | 0.9447 | 0.9448 | 0.9447 | 0.8988 | 0.8916 |
| 0.1181 | 11.9 | 940 | 0.1305 | 0.8997 | 0.9471 | 0.9473 | 0.9524 | 0.9418 | 0.9037 | 0.8957 |
| 0.1738 | 12.15 | 960 | 0.1278 | 0.9013 | 0.9480 | 0.9482 | 0.9519 | 0.9441 | 0.9051 | 0.8976 |
| 0.1269 | 12.41 | 980 | 0.1285 | 0.9000 | 0.9472 | 0.9475 | 0.9546 | 0.9398 | 0.9042 | 0.8959 |
| 0.1118 | 12.66 | 1000 | 0.1323 | 0.8967 | 0.9450 | 0.9457 | 0.9636 | 0.9263 | 0.9021 | 0.8913 |
| 0.0688 | 12.91 | 1020 | 0.1302 | 0.8998 | 0.9472 | 0.9473 | 0.9503 | 0.9441 | 0.9035 | 0.8960 |
| 0.1245 | 13.16 | 1040 | 0.1295 | 0.9002 | 0.9474 | 0.9476 | 0.9524 | 0.9423 | 0.9041 | 0.8963 |
| 0.0838 | 13.42 | 1060 | 0.1303 | 0.9007 | 0.9474 | 0.9479 | 0.9603 | 0.9345 | 0.9054 | 0.8961 |
| 0.1103 | 13.67 | 1080 | 0.1285 | 0.8994 | 0.9469 | 0.9471 | 0.9519 | 0.9419 | 0.9033 | 0.8954 |
| 0.1204 | 13.92 | 1100 | 0.1326 | 0.8988 | 0.9468 | 0.9468 | 0.9463 | 0.9474 | 0.9023 | 0.8954 |
| 0.0797 | 14.18 | 1120 | 0.1301 | 0.8989 | 0.9462 | 0.9469 | 0.9634 | 0.9291 | 0.9040 | 0.8937 |
| 0.1192 | 14.43 | 1140 | 0.1283 | 0.8998 | 0.9470 | 0.9474 | 0.9560 | 0.9381 | 0.9042 | 0.8955 |
| 0.1713 | 14.68 | 1160 | 0.1282 | 0.9015 | 0.9479 | 0.9483 | 0.9577 | 0.9381 | 0.9058 | 0.8971 |
| 0.1032 | 14.94 | 1180 | 0.1340 | 0.8957 | 0.9445 | 0.9451 | 0.9624 | 0.9265 | 0.9011 | 0.8904 |
| 0.1097 | 15.19 | 1200 | 0.1300 | 0.9003 | 0.9471 | 0.9476 | 0.9613 | 0.9329 | 0.9051 | 0.8955 |
| 0.194 | 15.44 | 1220 | 0.1269 | 0.9014 | 0.9479 | 0.9482 | 0.9568 | 0.9389 | 0.9056 | 0.8971 |
| 0.0849 | 15.7 | 1240 | 0.1288 | 0.8997 | 0.9468 | 0.9473 | 0.9609 | 0.9327 | 0.9045 | 0.8949 |
| 0.105 | 15.95 | 1260 | 0.1293 | 0.9003 | 0.9472 | 0.9476 | 0.9590 | 0.9353 | 0.9048 | 0.8957 |
| 0.1102 | 16.2 | 1280 | 0.1268 | 0.9008 | 0.9478 | 0.9479 | 0.9513 | 0.9442 | 0.9046 | 0.8971 |
| 0.1185 | 16.46 | 1300 | 0.1301 | 0.8998 | 0.9471 | 0.9473 | 0.9546 | 0.9395 | 0.9040 | 0.8956 |
| 0.1087 | 16.71 | 1320 | 0.1280 | 0.9001 | 0.9473 | 0.9475 | 0.9512 | 0.9435 | 0.9039 | 0.8963 |
| 0.1033 | 16.96 | 1340 | 0.1281 | 0.9002 | 0.9474 | 0.9475 | 0.9501 | 0.9447 | 0.9039 | 0.8965 |
| 0.1083 | 17.22 | 1360 | 0.1270 | 0.9010 | 0.9477 | 0.9480 | 0.9549 | 0.9406 | 0.9051 | 0.8969 |
| 0.124 | 17.47 | 1380 | 0.1285 | 0.9007 | 0.9476 | 0.9478 | 0.9536 | 0.9416 | 0.9047 | 0.8967 |
| 0.0835 | 17.72 | 1400 | 0.1288 | 0.8996 | 0.9468 | 0.9472 | 0.9583 | 0.9353 | 0.9041 | 0.8950 |
| 0.1263 | 17.97 | 1420 | 0.1274 | 0.9012 | 0.9477 | 0.9482 | 0.9584 | 0.9371 | 0.9056 | 0.8968 |
| 0.1073 | 18.23 | 1440 | 0.1274 | 0.9008 | 0.9476 | 0.9479 | 0.9562 | 0.9389 | 0.9050 | 0.8965 |
| 0.1323 | 18.48 | 1460 | 0.1282 | 0.9001 | 0.9475 | 0.9475 | 0.9488 | 0.9461 | 0.9037 | 0.8965 |
| 0.0845 | 18.73 | 1480 | 0.1272 | 0.9007 | 0.9475 | 0.9479 | 0.9556 | 0.9394 | 0.9049 | 0.8965 |
| 0.1131 | 18.99 | 1500 | 0.1279 | 0.9008 | 0.9476 | 0.9479 | 0.9545 | 0.9407 | 0.9048 | 0.8967 |
| 0.1198 | 19.24 | 1520 | 0.1306 | 0.8987 | 0.9466 | 0.9467 | 0.9492 | 0.9441 | 0.9024 | 0.8950 |
| 0.119 | 19.49 | 1540 | 0.1275 | 0.9009 | 0.9476 | 0.9480 | 0.9565 | 0.9388 | 0.9052 | 0.8966 |
| 0.0984 | 19.75 | 1560 | 0.1275 | 0.9007 | 0.9477 | 0.9478 | 0.9519 | 0.9434 | 0.9045 | 0.8969 |
| 0.0955 | 20.0 | 1580 | 0.1278 | 0.9007 | 0.9476 | 0.9478 | 0.9538 | 0.9414 | 0.9047 | 0.8966 |
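
For reference, per-class accuracy and IoU figures like those above are typically produced with the evaluate library's mean_iou metric. The sketch below is a hedged illustration, not the author's evaluation code; the ignore_index value and the assumption that the logits already match the label resolution are mine.

```python
# Hedged sketch of a compute_metrics function for two-class segmentation.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=1)  # (batch, H, W) class ids
    results = metric.compute(
        predictions=preds,
        references=labels,
        num_labels=2,          # Background, Metal lines
        ignore_index=255,      # common convention for unlabeled pixels (assumption)
        reduce_labels=False,
    )
    return {
        "mean_iou": results["mean_iou"],
        "mean_accuracy": results["mean_accuracy"],
        "overall_accuracy": results["overall_accuracy"],
        "accuracy_background": results["per_category_accuracy"][0],
        "accuracy_metal_lines": results["per_category_accuracy"][1],
        "iou_background": results["per_category_iou"][0],
        "iou_metal_lines": results["per_category_iou"][1],
    }
```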

Framework versions

  • Transformers 4.38.2
  • Pytorch 2.2.1+cu121
  • Datasets 2.17.1
  • Tokenizers 0.15.2