segformerSAAD2

This model is a fine-tuned version of nvidia/mit-b0 on the Saad287/SixGUN dataset. It achieves the following results on the evaluation set (a consistency check on the mean metrics follows the list):

  • Loss: 0.0658
  • Mean Iou: 0.8885
  • Mean Accuracy: 0.9302
  • Overall Accuracy: 0.9934
  • Accuracy Bkg: 0.9977
  • Accuracy Knife: 0.8700
  • Accuracy Gun: 0.9228
  • Iou Bkg: 0.9940
  • Iou Knife: 0.8345
  • Iou Gun: 0.8370
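
As a quick consistency check (the averaging rule is an assumption, not stated in the card), the reported Mean Iou and Mean Accuracy match the unweighted averages of the three per-class scores above:

```python
# Unweighted means of the per-class scores reported above (assumed definition).
iou_bkg, iou_knife, iou_gun = 0.9940, 0.8345, 0.8370
acc_bkg, acc_knife, acc_gun = 0.9977, 0.8700, 0.9228

mean_iou = (iou_bkg + iou_knife + iou_gun) / 3        # ~0.8885
mean_accuracy = (acc_bkg + acc_knife + acc_gun) / 3   # ~0.9302
print(round(mean_iou, 4), round(mean_accuracy, 4))
```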

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 6e-05
  • train_batch_size: 23
  • eval_batch_size: 23
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 500
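
A minimal sketch of how these settings would map onto the Hugging Face TrainingArguments API; the output_dir is an assumption, and dataset loading, model setup, and metric computation are omitted:

```python
from transformers import TrainingArguments

# Hypothetical reproduction of the listed hyperparameters; output_dir is assumed.
training_args = TrainingArguments(
    output_dir="segformerSAAD2",
    learning_rate=6e-5,
    per_device_train_batch_size=23,
    per_device_eval_batch_size=23,
    seed=42,
    num_train_epochs=500,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```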

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Bkg | Accuracy Knife | Accuracy Gun | Iou Bkg | Iou Knife | Iou Gun |
|---------------|-------|------|-----------------|----------|---------------|------------------|--------------|----------------|--------------|---------|-----------|---------|
| 0.9085 | 10.0 | 10 | 1.0050 | 0.5155 | 0.9279 | 0.9160 | 0.9152 | 0.9126 | 0.9559 | 0.9144 | 0.3135 | 0.3185 |
| 0.7013 | 20.0 | 20 | 0.8010 | 0.6030 | 0.8080 | 0.9590 | 0.9692 | 0.6715 | 0.7832 | 0.9590 | 0.4442 | 0.4060 |
| 0.5991 | 30.0 | 30 | 0.6384 | 0.6286 | 0.7939 | 0.9648 | 0.9765 | 0.6946 | 0.7107 | 0.9647 | 0.5156 | 0.4054 |
| 0.5105 | 40.0 | 40 | 0.5393 | 0.6328 | 0.8246 | 0.9632 | 0.9727 | 0.7618 | 0.7393 | 0.9631 | 0.5383 | 0.3970 |
| 0.4642 | 50.0 | 50 | 0.4625 | 0.6478 | 0.8186 | 0.9668 | 0.9769 | 0.7866 | 0.6922 | 0.9666 | 0.5662 | 0.4108 |
| 0.4301 | 60.0 | 60 | 0.4175 | 0.6580 | 0.8092 | 0.9696 | 0.9805 | 0.7819 | 0.6650 | 0.9693 | 0.5757 | 0.4291 |
| 0.3623 | 70.0 | 70 | 0.3713 | 0.7008 | 0.8249 | 0.9758 | 0.9861 | 0.7827 | 0.7061 | 0.9757 | 0.6257 | 0.5010 |
| 0.3356 | 80.0 | 80 | 0.3247 | 0.7648 | 0.8740 | 0.9825 | 0.9899 | 0.8066 | 0.8254 | 0.9827 | 0.6695 | 0.6422 |
| 0.2975 | 90.0 | 90 | 0.2862 | 0.7936 | 0.8754 | 0.9857 | 0.9932 | 0.8001 | 0.8330 | 0.9860 | 0.6986 | 0.6962 |
| 0.2722 | 100.0 | 100 | 0.2656 | 0.8099 | 0.8962 | 0.9869 | 0.9931 | 0.8202 | 0.8753 | 0.9873 | 0.7126 | 0.7296 |
| 0.2465 | 110.0 | 110 | 0.2377 | 0.8192 | 0.8904 | 0.9880 | 0.9946 | 0.8170 | 0.8597 | 0.9884 | 0.7251 | 0.7440 |
| 0.2221 | 120.0 | 120 | 0.2163 | 0.8311 | 0.9018 | 0.9889 | 0.9949 | 0.8265 | 0.8840 | 0.9895 | 0.7391 | 0.7648 |
| 0.2082 | 130.0 | 130 | 0.2007 | 0.8341 | 0.8971 | 0.9892 | 0.9955 | 0.8297 | 0.8662 | 0.9896 | 0.7500 | 0.7628 |
| 0.1974 | 140.0 | 140 | 0.1928 | 0.8462 | 0.9026 | 0.9901 | 0.9961 | 0.8390 | 0.8729 | 0.9906 | 0.7685 | 0.7795 |
| 0.1823 | 150.0 | 150 | 0.1720 | 0.8464 | 0.8974 | 0.9903 | 0.9966 | 0.8324 | 0.8632 | 0.9907 | 0.7710 | 0.7775 |
| 0.1743 | 160.0 | 160 | 0.1598 | 0.8533 | 0.9014 | 0.9908 | 0.9969 | 0.8307 | 0.8766 | 0.9913 | 0.7799 | 0.7888 |
| 0.1577 | 170.0 | 170 | 0.1560 | 0.8592 | 0.9150 | 0.9911 | 0.9963 | 0.8514 | 0.8973 | 0.9917 | 0.7892 | 0.7967 |
| 0.1468 | 180.0 | 180 | 0.1432 | 0.8632 | 0.9079 | 0.9915 | 0.9972 | 0.8406 | 0.8858 | 0.9920 | 0.7964 | 0.8012 |
| 0.1397 | 190.0 | 190 | 0.1323 | 0.8641 | 0.9128 | 0.9915 | 0.9969 | 0.8503 | 0.8911 | 0.9920 | 0.7993 | 0.8009 |
| 0.1305 | 200.0 | 200 | 0.1264 | 0.8659 | 0.9102 | 0.9917 | 0.9972 | 0.8468 | 0.8867 | 0.9922 | 0.8024 | 0.8032 |
| 0.1228 | 210.0 | 210 | 0.1209 | 0.8730 | 0.9188 | 0.9922 | 0.9972 | 0.8552 | 0.9038 | 0.9927 | 0.8119 | 0.8144 |
| 0.1179 | 220.0 | 220 | 0.1123 | 0.8730 | 0.9172 | 0.9922 | 0.9973 | 0.8527 | 0.9016 | 0.9928 | 0.8128 | 0.8133 |
| 0.1126 | 230.0 | 230 | 0.1078 | 0.8742 | 0.9203 | 0.9923 | 0.9972 | 0.8568 | 0.9069 | 0.9929 | 0.8148 | 0.8149 |
| 0.1066 | 240.0 | 240 | 0.1029 | 0.8758 | 0.9216 | 0.9924 | 0.9972 | 0.8587 | 0.9090 | 0.9930 | 0.8176 | 0.8169 |
| 0.1026 | 250.0 | 250 | 0.0978 | 0.8786 | 0.9224 | 0.9926 | 0.9974 | 0.8648 | 0.9050 | 0.9931 | 0.8230 | 0.8197 |
| 0.1012 | 260.0 | 260 | 0.0963 | 0.8797 | 0.9271 | 0.9927 | 0.9972 | 0.8659 | 0.9182 | 0.9933 | 0.8242 | 0.8217 |
| 0.0952 | 270.0 | 270 | 0.0926 | 0.8800 | 0.9229 | 0.9927 | 0.9975 | 0.8607 | 0.9104 | 0.9933 | 0.8250 | 0.8218 |
| 0.0938 | 280.0 | 280 | 0.0895 | 0.8819 | 0.9238 | 0.9929 | 0.9975 | 0.8649 | 0.9090 | 0.9934 | 0.8284 | 0.8237 |
| 0.0893 | 290.0 | 290 | 0.0860 | 0.8832 | 0.9260 | 0.9930 | 0.9975 | 0.8686 | 0.9120 | 0.9935 | 0.8301 | 0.8261 |
| 0.0868 | 300.0 | 300 | 0.0835 | 0.8831 | 0.9243 | 0.9930 | 0.9976 | 0.8646 | 0.9106 | 0.9935 | 0.8301 | 0.8257 |
| 0.0853 | 310.0 | 310 | 0.0820 | 0.8834 | 0.9257 | 0.9930 | 0.9975 | 0.8664 | 0.9132 | 0.9936 | 0.8307 | 0.8260 |
| 0.0817 | 320.0 | 320 | 0.0786 | 0.8867 | 0.9291 | 0.9932 | 0.9976 | 0.8707 | 0.9190 | 0.9938 | 0.8343 | 0.8319 |
| 0.0823 | 330.0 | 330 | 0.0785 | 0.8851 | 0.9261 | 0.9931 | 0.9977 | 0.8677 | 0.9129 | 0.9937 | 0.8326 | 0.8291 |
| 0.0858 | 340.0 | 340 | 0.0768 | 0.8860 | 0.9272 | 0.9932 | 0.9976 | 0.8680 | 0.9159 | 0.9938 | 0.8330 | 0.8312 |
| 0.078 | 350.0 | 350 | 0.0744 | 0.8853 | 0.9253 | 0.9931 | 0.9977 | 0.8648 | 0.9133 | 0.9937 | 0.8315 | 0.8308 |
| 0.0755 | 360.0 | 360 | 0.0739 | 0.8872 | 0.9290 | 0.9933 | 0.9976 | 0.8694 | 0.9201 | 0.9939 | 0.8337 | 0.8342 |
| 0.0752 | 370.0 | 370 | 0.0727 | 0.8880 | 0.9307 | 0.9933 | 0.9976 | 0.8721 | 0.9224 | 0.9939 | 0.8351 | 0.8349 |
| 0.0727 | 380.0 | 380 | 0.0719 | 0.8858 | 0.9269 | 0.9932 | 0.9977 | 0.8665 | 0.9166 | 0.9938 | 0.8318 | 0.8318 |
| 0.0725 | 390.0 | 390 | 0.0709 | 0.8873 | 0.9298 | 0.9933 | 0.9976 | 0.8705 | 0.9215 | 0.9939 | 0.8347 | 0.8333 |
| 0.0725 | 400.0 | 400 | 0.0695 | 0.8870 | 0.9283 | 0.9933 | 0.9977 | 0.8686 | 0.9188 | 0.9939 | 0.8331 | 0.8340 |
| 0.0718 | 410.0 | 410 | 0.0689 | 0.8890 | 0.9309 | 0.9934 | 0.9976 | 0.8706 | 0.9244 | 0.9940 | 0.8351 | 0.8378 |
| 0.0695 | 420.0 | 420 | 0.0679 | 0.8872 | 0.9284 | 0.9933 | 0.9977 | 0.8689 | 0.9186 | 0.9939 | 0.8335 | 0.8341 |
| 0.0707 | 430.0 | 430 | 0.0676 | 0.8881 | 0.9307 | 0.9933 | 0.9976 | 0.8686 | 0.9258 | 0.9940 | 0.8334 | 0.8370 |
| 0.0684 | 440.0 | 440 | 0.0668 | 0.8883 | 0.9298 | 0.9934 | 0.9977 | 0.8693 | 0.9225 | 0.9940 | 0.8339 | 0.8372 |
| 0.0692 | 450.0 | 450 | 0.0668 | 0.8887 | 0.9310 | 0.9934 | 0.9976 | 0.8708 | 0.9245 | 0.9940 | 0.8350 | 0.8372 |
| 0.0681 | 460.0 | 460 | 0.0661 | 0.8889 | 0.9296 | 0.9934 | 0.9977 | 0.8696 | 0.9216 | 0.9940 | 0.8344 | 0.8382 |
| 0.0688 | 470.0 | 470 | 0.0661 | 0.8888 | 0.9305 | 0.9934 | 0.9977 | 0.8708 | 0.9230 | 0.9940 | 0.8352 | 0.8372 |
| 0.0675 | 480.0 | 480 | 0.0659 | 0.8888 | 0.9307 | 0.9934 | 0.9976 | 0.8717 | 0.9228 | 0.9940 | 0.8353 | 0.8370 |
| 0.0679 | 490.0 | 490 | 0.0659 | 0.8886 | 0.9300 | 0.9934 | 0.9977 | 0.8708 | 0.9214 | 0.9940 | 0.8352 | 0.8367 |
| 0.0668 | 500.0 | 500 | 0.0658 | 0.8885 | 0.9302 | 0.9934 | 0.9977 | 0.8700 | 0.9228 | 0.9940 | 0.8345 | 0.8370 |
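
Once trained, the checkpoint can be loaded for segmentation inference. A minimal sketch, assuming the hub id Saad287/segformerSAAD2 for this model, that the repository includes an image-processor config, and a hypothetical local image path:

```python
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

checkpoint = "Saad287/segformerSAAD2"  # hub id assumed from this card
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class ids (bkg, knife, gun)
```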

Framework versions

  • Transformers 4.42.4
  • Pytorch 2.3.1+cu121
  • Datasets 2.21.0
  • Tokenizers 0.19.1