segformer-b3-finetuned-segments-outputs

This model is a fine-tuned version of nvidia/mit-b3 on the unreal-hug/REAL_DATASET_SEG_401_6_lbls dataset (a minimal inference sketch follows the metrics list below). It achieves the following results on the evaluation set:

  • Loss: 0.3002
  • Mean Iou: 0.2829
  • Mean Accuracy: 0.3326
  • Overall Accuracy: 0.6026
  • Accuracy Unlabeled: nan
  • Accuracy Lv: 0.7852
  • Accuracy Rv: 0.5699
  • Accuracy Ra: 0.5380
  • Accuracy La: 0.6208
  • Accuracy Vs: 0.0
  • Accuracy As: 0.0
  • Accuracy Mk: 0.0004
  • Accuracy Tk: nan
  • Accuracy Asd: 0.1783
  • Accuracy Vsd: 0.1873
  • Accuracy Ak: 0.4458
  • Iou Unlabeled: 0.0
  • Iou Lv: 0.7310
  • Iou Rv: 0.5182
  • Iou Ra: 0.5178
  • Iou La: 0.5526
  • Iou Vs: 0.0
  • Iou As: 0.0
  • Iou Mk: 0.0004
  • Iou Tk: nan
  • Iou Asd: 0.1728
  • Iou Vsd: 0.1827
  • Iou Ak: 0.4361
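
Since the usage sections below are still placeholders, here is a minimal inference sketch using the standard SegFormer flow in transformers. The repository id is assumed from the card title and the dataset owner's namespace, and the input file name is hypothetical; this is not code taken from the card itself.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Repo id assumed from the card title and the dataset owner's namespace.
checkpoint = "unreal-hug/segformer-b3-finetuned-segments-outputs"

processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example_scan.png").convert("RGB")  # hypothetical input file
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of label ids
```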

Model description

More information needed

Intended uses & limitations

More information needed. One caveat is visible in the evaluation results above: the Vs, As, and Mk classes are effectively never predicted (accuracy and IoU at or near 0.0), and Tk reports nan (likely absent from the evaluation set), so predictions for those labels should not be relied on.

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 25
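
These values map directly onto transformers TrainingArguments. A minimal sketch, assuming the Trainer API was used; the output_dir and the steps-based evaluation cadence are inferred from this card (the results table evaluates every 100 steps), not stated explicitly, and the optimizer line above matches the Trainer default (AdamW with betas=(0.9, 0.999), epsilon=1e-8).

```python
from transformers import TrainingArguments

# Sketch only: output_dir and the evaluation cadence are assumptions inferred
# from the card; the optimizer settings are the Trainer defaults.
training_args = TrainingArguments(
    output_dir="segformer-b3-finetuned-segments-outputs",
    learning_rate=1e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=25,
    lr_scheduler_type="linear",
    evaluation_strategy="steps",
    eval_steps=100,  # the results table logs validation metrics every 100 steps
)
```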

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Lv | Accuracy Rv | Accuracy Ra | Accuracy La | Accuracy Vs | Accuracy As | Accuracy Mk | Accuracy Tk | Accuracy Asd | Accuracy Vsd | Accuracy Ak | Iou Unlabeled | Iou Lv | Iou Rv | Iou Ra | Iou La | Iou Vs | Iou As | Iou Mk | Iou Tk | Iou Asd | Iou Vsd | Iou Ak |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.4253 | 0.62 | 100 | 0.4599 | 0.1588 | 0.2152 | 0.4965 | nan | 0.8879 | 0.0778 | 0.1004 | 0.5637 | 0.0 | 0.0 | 0.0 | nan | 0.0120 | 0.0509 | 0.4590 | 0.0 | 0.6799 | 0.0770 | 0.0985 | 0.3899 | 0.0 | 0.0 | 0.0 | nan | 0.0120 | 0.0509 | 0.4386 |
| 0.3839 | 1.25 | 200 | 0.3598 | 0.2325 | 0.2929 | 0.5740 | nan | 0.8720 | 0.4761 | 0.6272 | 0.2194 | 0.0 | 0.0 | 0.0 | nan | 0.0102 | 0.2038 | 0.5201 | 0.0 | 0.8020 | 0.4259 | 0.4142 | 0.2085 | 0.0 | 0.0 | 0.0 | nan | 0.0102 | 0.1964 | 0.4999 |
| 0.4634 | 1.88 | 300 | 0.3361 | 0.3031 | 0.3870 | 0.6197 | nan | 0.7362 | 0.7347 | 0.2986 | 0.7550 | 0.0 | 0.0 | 0.0 | nan | 0.3070 | 0.4629 | 0.5752 | 0.0 | 0.6984 | 0.5947 | 0.2894 | 0.5089 | 0.0 | 0.0 | 0.0 | nan | 0.2756 | 0.4265 | 0.5410 |
| 0.147 | 2.5 | 400 | 0.3123 | 0.3081 | 0.3772 | 0.5772 | nan | 0.6525 | 0.4740 | 0.6282 | 0.5966 | 0.0 | 0.0 | 0.0002 | nan | 0.2846 | 0.5934 | 0.5425 | 0.0 | 0.6202 | 0.4429 | 0.5296 | 0.5133 | 0.0 | 0.0 | 0.0002 | nan | 0.2597 | 0.5196 | 0.5033 |
| 0.2044 | 3.12 | 500 | 0.3104 | 0.2918 | 0.3459 | 0.5719 | nan | 0.7327 | 0.5989 | 0.5243 | 0.4087 | 0.0 | 0.0 | 0.0046 | nan | 0.0585 | 0.5632 | 0.5678 | 0.0 | 0.6887 | 0.5466 | 0.4931 | 0.3770 | 0.0 | 0.0 | 0.0045 | nan | 0.0583 | 0.4945 | 0.5471 |
| 0.3223 | 3.75 | 600 | 0.3078 | 0.3341 | 0.4038 | 0.6417 | nan | 0.6870 | 0.5831 | 0.7323 | 0.7609 | 0.0019 | 0.0 | 0.0267 | nan | 0.2290 | 0.4286 | 0.5887 | 0.0 | 0.6482 | 0.5377 | 0.6608 | 0.6435 | 0.0019 | 0.0 | 0.0255 | nan | 0.2199 | 0.3893 | 0.5488 |
| 0.275 | 4.38 | 700 | 0.3081 | 0.3007 | 0.3562 | 0.5801 | nan | 0.7267 | 0.3140 | 0.5325 | 0.6536 | 0.0024 | 0.0 | 0.0 | nan | 0.2228 | 0.5105 | 0.5992 | 0.0 | 0.6833 | 0.2982 | 0.5065 | 0.5827 | 0.0024 | 0.0 | 0.0 | nan | 0.2110 | 0.4492 | 0.5741 |
| 0.2679 | 5.0 | 800 | 0.3002 | 0.2829 | 0.3326 | 0.6026 | nan | 0.7852 | 0.5699 | 0.5380 | 0.6208 | 0.0 | 0.0 | 0.0004 | nan | 0.1783 | 0.1873 | 0.4458 | 0.0 | 0.7310 | 0.5182 | 0.5178 | 0.5526 | 0.0 | 0.0 | 0.0004 | nan | 0.1728 | 0.1827 | 0.4361 |
| 0.3721 | 5.62 | 900 | 0.3100 | 0.3449 | 0.4111 | 0.6774 | nan | 0.8066 | 0.6839 | 0.6907 | 0.6722 | 0.0004 | 0.0 | 0.0002 | nan | 0.2097 | 0.5078 | 0.5401 | 0.0 | 0.7558 | 0.6115 | 0.6389 | 0.6063 | 0.0004 | 0.0 | 0.0002 | nan | 0.2043 | 0.4613 | 0.5147 |
| 0.2418 | 6.25 | 1000 | 0.3161 | 0.3769 | 0.4608 | 0.7076 | nan | 0.7978 | 0.6939 | 0.6991 | 0.7553 | 0.1402 | 0.0 | 0.0 | nan | 0.2148 | 0.6464 | 0.6604 | 0.0 | 0.7465 | 0.6110 | 0.6455 | 0.6508 | 0.1308 | 0.0 | 0.0 | nan | 0.2046 | 0.5357 | 0.6210 |
| 0.5517 | 6.88 | 1100 | 0.3622 | 0.1738 | 0.2011 | 0.3603 | nan | 0.5002 | 0.2451 | 0.4020 | 0.3224 | 0.0287 | 0.0 | 0.0143 | nan | 0.1725 | 0.1302 | 0.1956 | 0.0 | 0.4829 | 0.2368 | 0.3610 | 0.3027 | 0.0279 | 0.0 | 0.0139 | nan | 0.1660 | 0.1262 | 0.1944 |
| 0.2611 | 7.5 | 1200 | 0.3240 | 0.3572 | 0.4346 | 0.6530 | nan | 0.7703 | 0.6570 | 0.6721 | 0.5853 | 0.1717 | 0.0 | 0.0561 | nan | 0.3176 | 0.5672 | 0.5490 | 0.0 | 0.7190 | 0.5879 | 0.5838 | 0.5265 | 0.1576 | 0.0 | 0.0520 | nan | 0.2832 | 0.4852 | 0.5341 |
| 0.2422 | 8.12 | 1300 | 0.3206 | 0.3382 | 0.4095 | 0.6283 | nan | 0.7598 | 0.5413 | 0.6799 | 0.5747 | 0.1393 | 0.0 | 0.1071 | nan | 0.2918 | 0.4583 | 0.5432 | 0.0 | 0.7139 | 0.4894 | 0.5792 | 0.5128 | 0.1306 | 0.0 | 0.0900 | nan | 0.2601 | 0.4134 | 0.5313 |
| 0.2 | 8.75 | 1400 | 0.3110 | 0.3299 | 0.3976 | 0.5977 | nan | 0.6984 | 0.4791 | 0.6668 | 0.6132 | 0.2240 | 0.0 | 0.0000 | nan | 0.3035 | 0.4994 | 0.4917 | 0.0 | 0.6626 | 0.4281 | 0.5904 | 0.5516 | 0.2026 | 0.0 | 0.0000 | nan | 0.2767 | 0.4409 | 0.4754 |
| 0.1095 | 9.38 | 1500 | 0.3375 | 0.2732 | 0.3235 | 0.5205 | nan | 0.5957 | 0.4483 | 0.5939 | 0.5724 | 0.1094 | 0.0 | 0.0005 | nan | 0.2502 | 0.2124 | 0.4518 | 0.0 | 0.5689 | 0.4004 | 0.5432 | 0.5122 | 0.1038 | 0.0 | 0.0005 | nan | 0.2293 | 0.2068 | 0.4398 |
| 0.2373 | 10.0 | 1600 | 0.3453 | 0.3066 | 0.3658 | 0.5723 | nan | 0.6940 | 0.5507 | 0.5989 | 0.5415 | 0.2547 | 0.0 | 0.0 | nan | 0.1549 | 0.4018 | 0.4611 | 0.0 | 0.6560 | 0.5007 | 0.5402 | 0.4888 | 0.2231 | 0.0 | 0.0 | nan | 0.1519 | 0.3680 | 0.4442 |
| 0.0756 | 10.62 | 1700 | 0.3413 | 0.3699 | 0.4457 | 0.6868 | nan | 0.7934 | 0.6758 | 0.6577 | 0.7146 | 0.2091 | 0.0 | 0.0075 | nan | 0.2043 | 0.5427 | 0.6520 | 0.0 | 0.7465 | 0.6060 | 0.6120 | 0.6207 | 0.1863 | 0.0 | 0.0071 | nan | 0.1908 | 0.4923 | 0.6071 |
| 0.1072 | 11.25 | 1800 | 0.3736 | 0.2889 | 0.3434 | 0.5518 | nan | 0.6798 | 0.5118 | 0.6135 | 0.5297 | 0.1772 | 0.0 | 0.0195 | nan | 0.1954 | 0.3432 | 0.3636 | 0.0 | 0.6444 | 0.4854 | 0.5561 | 0.4786 | 0.1539 | 0.0 | 0.0183 | nan | 0.1788 | 0.3106 | 0.3523 |
| 0.1216 | 11.88 | 1900 | 0.3648 | 0.3248 | 0.3879 | 0.6056 | nan | 0.7039 | 0.5606 | 0.6138 | 0.6566 | 0.1644 | 0.0 | 0.0080 | nan | 0.2637 | 0.3991 | 0.5087 | 0.0 | 0.6665 | 0.5153 | 0.5725 | 0.5704 | 0.1453 | 0.0 | 0.0074 | nan | 0.2402 | 0.3677 | 0.4877 |
| 0.1401 | 12.5 | 2000 | 0.3436 | 0.3537 | 0.4292 | 0.6524 | nan | 0.7521 | 0.6339 | 0.6030 | 0.7209 | 0.1334 | 0.0 | 0.0988 | nan | 0.3603 | 0.4304 | 0.5592 | 0.0 | 0.7059 | 0.5504 | 0.5546 | 0.6319 | 0.1235 | 0.0 | 0.0846 | nan | 0.3103 | 0.3901 | 0.5391 |
| 0.1436 | 13.12 | 2100 | 0.3869 | 0.3156 | 0.3744 | 0.5828 | nan | 0.7025 | 0.4233 | 0.5510 | 0.6780 | 0.1886 | 0.0 | 0.0510 | nan | 0.2666 | 0.3543 | 0.5291 | 0.0 | 0.6640 | 0.3923 | 0.5214 | 0.5994 | 0.1688 | 0.0 | 0.0440 | nan | 0.2386 | 0.3359 | 0.5075 |
| 0.0907 | 13.75 | 2200 | 0.3739 | 0.3237 | 0.3853 | 0.6046 | nan | 0.7534 | 0.5218 | 0.6138 | 0.5515 | 0.2576 | 0.0 | 0.0377 | nan | 0.2211 | 0.3392 | 0.5574 | 0.0 | 0.7090 | 0.4937 | 0.5742 | 0.4980 | 0.2077 | 0.0 | 0.0343 | nan | 0.2079 | 0.3158 | 0.5206 |
| 0.147 | 14.38 | 2300 | 0.3751 | 0.3667 | 0.4460 | 0.6265 | nan | 0.6614 | 0.6418 | 0.5923 | 0.7208 | 0.2728 | 0.0 | 0.0884 | nan | 0.2801 | 0.5884 | 0.6142 | 0.0 | 0.6267 | 0.5779 | 0.5584 | 0.6181 | 0.2302 | 0.0 | 0.0739 | nan | 0.2553 | 0.5113 | 0.5816 |
| 0.0612 | 15.0 | 2400 | 0.3993 | 0.3152 | 0.3777 | 0.5802 | nan | 0.6818 | 0.5538 | 0.6054 | 0.5973 | 0.1225 | 0.0 | 0.0486 | nan | 0.3157 | 0.4393 | 0.4124 | 0.0 | 0.6439 | 0.5163 | 0.5435 | 0.5331 | 0.1134 | 0.0 | 0.0438 | nan | 0.2698 | 0.3983 | 0.4056 |
| 0.0854 | 15.62 | 2500 | 0.4168 | 0.3039 | 0.3621 | 0.5569 | nan | 0.6689 | 0.4421 | 0.5384 | 0.5871 | 0.1719 | 0.0 | 0.0233 | nan | 0.2696 | 0.3985 | 0.5212 | 0.0 | 0.6317 | 0.4223 | 0.4957 | 0.5199 | 0.1508 | 0.0 | 0.0213 | nan | 0.2457 | 0.3702 | 0.4855 |
| 0.0806 | 16.25 | 2600 | 0.4017 | 0.3460 | 0.4169 | 0.6201 | nan | 0.7083 | 0.6388 | 0.6258 | 0.5904 | 0.1749 | 0.0 | 0.1096 | nan | 0.2301 | 0.4998 | 0.5915 | 0.0 | 0.6659 | 0.5768 | 0.5736 | 0.5358 | 0.1552 | 0.0 | 0.0903 | nan | 0.2099 | 0.4466 | 0.5522 |
| 0.137 | 16.88 | 2700 | 0.4268 | 0.2834 | 0.3348 | 0.5474 | nan | 0.6984 | 0.4311 | 0.5213 | 0.5677 | 0.0688 | 0.0 | 0.0219 | nan | 0.1758 | 0.4672 | 0.3961 | 0.0 | 0.6564 | 0.4077 | 0.4842 | 0.5061 | 0.0638 | 0.0 | 0.0208 | nan | 0.1653 | 0.4276 | 0.3855 |
| 0.0375 | 17.5 | 2800 | 0.4117 | 0.2816 | 0.3339 | 0.5291 | nan | 0.6131 | 0.4906 | 0.6136 | 0.5158 | 0.0881 | 0.0 | 0.0292 | nan | 0.2010 | 0.3391 | 0.4484 | 0.0 | 0.5803 | 0.4398 | 0.5575 | 0.4677 | 0.0809 | 0.0 | 0.0272 | nan | 0.1899 | 0.3179 | 0.4369 |
| 0.0654 | 18.12 | 2900 | 0.4334 | 0.3470 | 0.4190 | 0.6392 | nan | 0.7536 | 0.6040 | 0.6625 | 0.6205 | 0.1722 | 0.0 | 0.1006 | nan | 0.3133 | 0.4067 | 0.5566 | 0.0 | 0.7052 | 0.5599 | 0.5923 | 0.5496 | 0.1515 | 0.0 | 0.0809 | nan | 0.2670 | 0.3782 | 0.5320 |
| 0.0759 | 18.75 | 3000 | 0.4226 | 0.3140 | 0.3770 | 0.5661 | nan | 0.6390 | 0.5546 | 0.6119 | 0.5289 | 0.1559 | 0.0 | 0.0158 | nan | 0.2478 | 0.4546 | 0.5611 | 0.0 | 0.6063 | 0.5000 | 0.5449 | 0.4774 | 0.1370 | 0.0 | 0.0148 | nan | 0.2238 | 0.4182 | 0.5319 |
| 0.1047 | 19.38 | 3100 | 0.4350 | 0.3058 | 0.3639 | 0.5608 | nan | 0.6803 | 0.5207 | 0.5750 | 0.5243 | 0.1947 | 0.0 | 0.0335 | nan | 0.2706 | 0.3744 | 0.4656 | 0.0 | 0.6424 | 0.4914 | 0.5193 | 0.4712 | 0.1700 | 0.0 | 0.0307 | nan | 0.2409 | 0.3498 | 0.4479 |
| 0.146 | 20.0 | 3200 | 0.4320 | 0.3138 | 0.3796 | 0.5634 | nan | 0.6526 | 0.4673 | 0.5958 | 0.5859 | 0.2021 | 0.0 | 0.0287 | nan | 0.2886 | 0.4822 | 0.4933 | 0.0 | 0.6188 | 0.4135 | 0.5438 | 0.5232 | 0.1710 | 0.0 | 0.0244 | nan | 0.2502 | 0.4367 | 0.4706 |
| 0.1012 | 20.62 | 3300 | 0.4231 | 0.3294 | 0.3967 | 0.5944 | nan | 0.6824 | 0.5358 | 0.6163 | 0.5851 | 0.1819 | 0.0 | 0.0243 | nan | 0.3027 | 0.4514 | 0.5866 | 0.0 | 0.6449 | 0.4899 | 0.5645 | 0.5214 | 0.1589 | 0.0 | 0.0213 | nan | 0.2605 | 0.4193 | 0.5423 |
| 0.1004 | 21.25 | 3400 | 0.4312 | 0.3369 | 0.4078 | 0.6181 | nan | 0.7167 | 0.5900 | 0.6539 | 0.5973 | 0.1753 | 0.0 | 0.0330 | nan | 0.2538 | 0.5161 | 0.5419 | 0.0 | 0.6767 | 0.5234 | 0.5867 | 0.5300 | 0.1515 | 0.0 | 0.0276 | nan | 0.2273 | 0.4649 | 0.5176 |
| 0.0837 | 21.88 | 3500 | 0.4385 | 0.3202 | 0.3844 | 0.5932 | nan | 0.6960 | 0.5322 | 0.6045 | 0.5847 | 0.1779 | 0.0 | 0.0238 | nan | 0.2458 | 0.3876 | 0.5910 | 0.0 | 0.6549 | 0.4828 | 0.5517 | 0.5181 | 0.1554 | 0.0 | 0.0210 | nan | 0.2195 | 0.3639 | 0.5549 |
| 0.1212 | 22.5 | 3600 | 0.4473 | 0.3209 | 0.3857 | 0.5969 | nan | 0.7202 | 0.5315 | 0.5947 | 0.5830 | 0.1908 | 0.0 | 0.0382 | nan | 0.2426 | 0.4183 | 0.5379 | 0.0 | 0.6752 | 0.4757 | 0.5356 | 0.5134 | 0.1673 | 0.0 | 0.0335 | nan | 0.2203 | 0.3885 | 0.5200 |
| 0.0698 | 23.12 | 3700 | 0.4587 | 0.3033 | 0.3629 | 0.5581 | nan | 0.6604 | 0.5113 | 0.5777 | 0.5497 | 0.1981 | 0.0 | 0.0128 | nan | 0.2450 | 0.3808 | 0.4930 | 0.0 | 0.6252 | 0.4590 | 0.5288 | 0.4903 | 0.1688 | 0.0 | 0.0121 | nan | 0.2188 | 0.3569 | 0.4760 |
| 0.1282 | 23.75 | 3800 | 0.4509 | 0.3262 | 0.3922 | 0.5981 | nan | 0.6936 | 0.5414 | 0.6098 | 0.6114 | 0.1966 | 0.0 | 0.0250 | nan | 0.2791 | 0.3966 | 0.5680 | 0.0 | 0.6536 | 0.4908 | 0.5585 | 0.5406 | 0.1666 | 0.0 | 0.0219 | nan | 0.2447 | 0.3730 | 0.5383 |
| 0.0473 | 24.38 | 3900 | 0.4496 | 0.3334 | 0.4008 | 0.6063 | nan | 0.7051 | 0.5613 | 0.6237 | 0.5998 | 0.1989 | 0.0 | 0.0330 | nan | 0.2805 | 0.4479 | 0.5579 | 0.0 | 0.6636 | 0.5091 | 0.5670 | 0.5331 | 0.1698 | 0.0 | 0.0286 | nan | 0.2470 | 0.4166 | 0.5329 |
| 0.069 | 25.0 | 4000 | 0.4442 | 0.3404 | 0.4109 | 0.6170 | nan | 0.7116 | 0.5806 | 0.6320 | 0.6083 | 0.2084 | 0.0 | 0.0391 | nan | 0.2758 | 0.4698 | 0.5837 | 0.0 | 0.6682 | 0.5204 | 0.5734 | 0.5400 | 0.1764 | 0.0 | 0.0335 | nan | 0.2434 | 0.4342 | 0.5547 |
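
The per-label Accuracy and Iou columns above match the output format of the mean_iou metric from the evaluate library. Below is a minimal sketch of how such numbers are computed for one prediction/reference pair; the label count of 12 (from the class list above), the ignore index, and the random masks are illustrative assumptions, not values taken from this card.

```python
import numpy as np
import evaluate

# Sketch only: num_labels=12 and ignore_index=255 are assumptions; the random
# masks stand in for real predicted and annotated segmentation maps.
mean_iou = evaluate.load("mean_iou")

pred = np.random.randint(0, 12, size=(256, 256))  # hypothetical predicted mask
ref = np.random.randint(0, 12, size=(256, 256))   # hypothetical ground-truth mask

results = mean_iou.compute(
    predictions=[pred],
    references=[ref],
    num_labels=12,
    ignore_index=255,
    reduce_labels=False,
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"])  # one entry per label; nan for labels absent from the references
```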

Framework versions

  • Transformers 4.37.2
  • Pytorch 2.1.2+cu121
  • Datasets 2.16.1
  • Tokenizers 0.15.0