
segformer-b0_DsMetalDam_Augmented_Cropped

This model is a fine-tuned version of nvidia/mit-b0 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2486
  • Mean Iou: 0.6867
  • Mean Accuracy: 0.7623
  • Overall Accuracy: 0.9106
  • Accuracy Matrix: 0.8910
  • Accuracy Austenite: 0.9442
  • Accuracy Martensite/austenite: 0.8061
  • Accuracy Precipitate: 0.2109
  • Accuracy Defect: 0.9591
  • Iou Matrix: 0.8022
  • Iou Austenite: 0.8886
  • Iou Martensite/austenite: 0.6946
  • Iou Precipitate: 0.1697
  • Iou Defect: 0.8786
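The per-class IoU and accuracy numbers above follow the standard semantic-segmentation definitions, computed from a confusion matrix over pixels. The sketch below is illustrative only (it is not the evaluation script behind these figures) and shows how per-class IoU, per-class accuracy, and overall accuracy are typically derived:

```python
import numpy as np

# Illustrative sketch: standard segmentation metrics from a confusion
# matrix (rows = ground truth class, columns = predicted class).
def segmentation_metrics(cm: np.ndarray):
    cm = cm.astype(np.float64)
    tp = np.diag(cm)                # correctly classified pixels per class
    fp = cm.sum(axis=0) - tp        # predicted as class c but wrong
    fn = cm.sum(axis=1) - tp        # ground truth c but missed
    iou = tp / (tp + fp + fn)       # per-class intersection over union
    acc = tp / cm.sum(axis=1)       # per-class (recall-style) accuracy
    overall = tp.sum() / cm.sum()   # overall pixel accuracy
    return iou, acc, overall

# Toy 2-class confusion matrix, not data from this model
cm = np.array([[8, 2],
               [1, 9]])
iou, acc, overall = segmentation_metrics(cm)
```

"Mean Iou" and "Mean Accuracy" are then the unweighted means of the per-class values, which is why the rare Precipitate class (IoU 0.1697) pulls the mean well below the overall accuracy.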

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 30

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Matrix | Accuracy Austenite | Accuracy Martensite/austenite | Accuracy Precipitate | Accuracy Defect | Iou Matrix | Iou Austenite | Iou Martensite/austenite | Iou Precipitate | Iou Defect |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.2546 | 1.0 | 343 | 0.3220 | 0.5965 | 0.6868 | 0.8757 | 0.8517 | 0.9218 | 0.7201 | 0.0 | 0.9404 | 0.7384 | 0.8585 | 0.5502 | 0.0 | 0.8353 |
| 0.336 | 2.0 | 686 | 0.3159 | 0.5992 | 0.6766 | 0.8807 | 0.8816 | 0.9295 | 0.6220 | 0.0 | 0.9497 | 0.7474 | 0.8627 | 0.5429 | 0.0 | 0.8428 |
| 0.2976 | 3.0 | 1029 | 0.3087 | 0.6057 | 0.6971 | 0.8807 | 0.8383 | 0.9325 | 0.7561 | 0.0000 | 0.9583 | 0.7412 | 0.8629 | 0.5833 | 0.0000 | 0.8411 |
| 0.2791 | 4.0 | 1372 | 0.2907 | 0.6175 | 0.6995 | 0.8886 | 0.8717 | 0.9290 | 0.7401 | 0.0016 | 0.9548 | 0.7608 | 0.8674 | 0.6070 | 0.0016 | 0.8507 |
| 0.2795 | 5.0 | 1715 | 0.2883 | 0.6264 | 0.7025 | 0.8906 | 0.8675 | 0.9369 | 0.7303 | 0.0291 | 0.9489 | 0.7630 | 0.8689 | 0.6135 | 0.0283 | 0.8584 |
| 0.2215 | 6.0 | 2058 | 0.2845 | 0.6316 | 0.7081 | 0.8924 | 0.8873 | 0.9252 | 0.7457 | 0.0452 | 0.9373 | 0.7700 | 0.8700 | 0.6212 | 0.0431 | 0.8536 |
| 0.2372 | 7.0 | 2401 | 0.2770 | 0.6343 | 0.7197 | 0.8931 | 0.8565 | 0.9373 | 0.7906 | 0.0492 | 0.9651 | 0.7657 | 0.8715 | 0.6365 | 0.0472 | 0.8504 |
| 0.3055 | 8.0 | 2744 | 0.2742 | 0.6337 | 0.7201 | 0.8950 | 0.8835 | 0.9220 | 0.8026 | 0.0324 | 0.9603 | 0.7742 | 0.8728 | 0.6413 | 0.0317 | 0.8482 |
| 0.2047 | 9.0 | 3087 | 0.2680 | 0.6497 | 0.7251 | 0.8982 | 0.8733 | 0.9384 | 0.7786 | 0.0884 | 0.9468 | 0.7765 | 0.8766 | 0.6500 | 0.0819 | 0.8634 |
| 0.1705 | 10.0 | 3430 | 0.2675 | 0.6489 | 0.7328 | 0.8987 | 0.8744 | 0.9336 | 0.8043 | 0.0862 | 0.9654 | 0.7793 | 0.8767 | 0.6531 | 0.0802 | 0.8550 |
| 0.2029 | 11.0 | 3773 | 0.2685 | 0.6523 | 0.7267 | 0.9003 | 0.8751 | 0.9443 | 0.7596 | 0.0958 | 0.9589 | 0.7812 | 0.8779 | 0.6536 | 0.0890 | 0.8600 |
| 0.1707 | 12.0 | 4116 | 0.2612 | 0.6591 | 0.7360 | 0.9015 | 0.8866 | 0.9324 | 0.7982 | 0.1097 | 0.9532 | 0.7853 | 0.8788 | 0.6639 | 0.0995 | 0.8679 |
| 0.2742 | 13.0 | 4459 | 0.2628 | 0.6512 | 0.7247 | 0.9022 | 0.8756 | 0.9442 | 0.7781 | 0.0666 | 0.9593 | 0.7847 | 0.8797 | 0.6635 | 0.0633 | 0.8651 |
| 0.2991 | 14.0 | 4802 | 0.2702 | 0.6653 | 0.7404 | 0.9025 | 0.8909 | 0.9368 | 0.7673 | 0.1492 | 0.9578 | 0.7870 | 0.8799 | 0.6627 | 0.1247 | 0.8722 |
| 0.229 | 15.0 | 5145 | 0.2599 | 0.6615 | 0.7395 | 0.9026 | 0.8723 | 0.9463 | 0.7800 | 0.1303 | 0.9687 | 0.7850 | 0.8798 | 0.6682 | 0.1143 | 0.8604 |
| 0.2004 | 16.0 | 5488 | 0.2595 | 0.6719 | 0.7473 | 0.9042 | 0.8854 | 0.9398 | 0.7863 | 0.1735 | 0.9513 | 0.7898 | 0.8814 | 0.6719 | 0.1442 | 0.8721 |
| 0.1944 | 17.0 | 5831 | 0.2564 | 0.6729 | 0.7486 | 0.9058 | 0.8940 | 0.9368 | 0.7895 | 0.1693 | 0.9536 | 0.7936 | 0.8830 | 0.6778 | 0.1418 | 0.8685 |
| 0.2068 | 18.0 | 6174 | 0.2539 | 0.6664 | 0.7450 | 0.9061 | 0.8915 | 0.9362 | 0.8051 | 0.1245 | 0.9677 | 0.7940 | 0.8839 | 0.6801 | 0.1102 | 0.8641 |
| 0.2461 | 19.0 | 6517 | 0.2494 | 0.6776 | 0.7603 | 0.9063 | 0.8756 | 0.9427 | 0.8251 | 0.1941 | 0.9642 | 0.7927 | 0.8854 | 0.6800 | 0.1585 | 0.8712 |
| 0.2252 | 20.0 | 6860 | 0.2498 | 0.6733 | 0.7461 | 0.9074 | 0.8813 | 0.9452 | 0.8043 | 0.1456 | 0.9542 | 0.7947 | 0.8856 | 0.6843 | 0.1284 | 0.8736 |
| 0.1975 | 21.0 | 7203 | 0.2519 | 0.6761 | 0.7516 | 0.9084 | 0.8960 | 0.9386 | 0.7992 | 0.1656 | 0.9585 | 0.7989 | 0.8861 | 0.6862 | 0.1412 | 0.8679 |
| 0.2356 | 22.0 | 7546 | 0.2506 | 0.6801 | 0.7526 | 0.9087 | 0.8956 | 0.9396 | 0.7972 | 0.1764 | 0.9542 | 0.7991 | 0.8858 | 0.6890 | 0.1486 | 0.8779 |
| 0.1838 | 23.0 | 7889 | 0.2510 | 0.6805 | 0.7554 | 0.9088 | 0.8835 | 0.9455 | 0.8068 | 0.1824 | 0.9589 | 0.7978 | 0.8867 | 0.6892 | 0.1516 | 0.8773 |
| 0.1576 | 24.0 | 8232 | 0.2511 | 0.6850 | 0.7658 | 0.9091 | 0.8913 | 0.9418 | 0.8021 | 0.2291 | 0.9650 | 0.7996 | 0.8868 | 0.6891 | 0.1765 | 0.8731 |
| 0.1504 | 25.0 | 8575 | 0.2505 | 0.6819 | 0.7590 | 0.9092 | 0.8869 | 0.9439 | 0.8077 | 0.1916 | 0.9650 | 0.7992 | 0.8873 | 0.6890 | 0.1587 | 0.8751 |
| 0.2196 | 26.0 | 8918 | 0.2530 | 0.6830 | 0.7597 | 0.9095 | 0.8946 | 0.9405 | 0.8035 | 0.1985 | 0.9612 | 0.8010 | 0.8872 | 0.6900 | 0.1610 | 0.8756 |
| 0.1781 | 27.0 | 9261 | 0.2509 | 0.6841 | 0.7596 | 0.9101 | 0.8901 | 0.9451 | 0.7993 | 0.2000 | 0.9635 | 0.8010 | 0.8880 | 0.6930 | 0.1635 | 0.8749 |
| 0.1578 | 28.0 | 9604 | 0.2485 | 0.6831 | 0.7591 | 0.9102 | 0.8874 | 0.9457 | 0.8064 | 0.1912 | 0.9651 | 0.8008 | 0.8882 | 0.6942 | 0.1585 | 0.8740 |
| 0.1931 | 29.0 | 9947 | 0.2495 | 0.6840 | 0.7579 | 0.9105 | 0.8893 | 0.9454 | 0.8042 | 0.1899 | 0.9604 | 0.8016 | 0.8884 | 0.6940 | 0.1580 | 0.8779 |
| 0.1582 | 30.0 | 10290 | 0.2486 | 0.6867 | 0.7623 | 0.9106 | 0.8910 | 0.9442 | 0.8061 | 0.2109 | 0.9591 | 0.8022 | 0.8886 | 0.6946 | 0.1697 | 0.8786 |

Framework versions

  • Transformers 4.33.2
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.5
  • Tokenizers 0.13.3