---
license: other
base_model: nvidia/mit-b0
tags:
  - vision
  - image-segmentation
  - generated_from_trainer
model-index:
  - name: segformer-b0-finetuned-arabidopsis-roots-multi
    results: []
---

# segformer-b0-finetuned-arabidopsis-roots-multi

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the jacquelinegrimm/arabidopsis-roots-multi dataset. It achieves the following results on the evaluation set:

- Loss: 0.0675
- Mean Iou: 0.5272
- Mean Accuracy: 0.7048
- Overall Accuracy: 0.6874
- Accuracy Background: nan
- Accuracy Main root: 0.6670
- Accuracy Lateral root: 0.6181
- Accuracy Shoot: 0.7576
- Accuracy Botrytis: 0.7765
- Iou Background: 0.0
- Iou Main root: 0.6256
- Iou Lateral root: 0.5603
- Iou Shoot: 0.6767
- Iou Botrytis: 0.7734
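
A minimal inference sketch is shown below. This is an assumption about usage rather than code from the author: the checkpoint ID is inferred from this card's name, and the input image path is a placeholder.

```python
# Hedged inference sketch (assumed usage; checkpoint ID inferred from this card's name).
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

checkpoint = "jacquelinegrimm/segformer-b0-finetuned-arabidopsis-roots-multi"
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("root_scan.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of predicted class indices
```

Check the checkpoint's `config.json` (`id2label`) to map class indices back to the root, shoot, and Botrytis labels reported above.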

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
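
No dataset details are given beyond the dataset ID in the introduction. Assuming it is a standard Hub dataset, a minimal loading sketch would be (splits and column names are unverified assumptions; check the dataset card):

```python
# Sketch: load the dataset named in the introduction (assumed to be hosted on the Hub).
from datasets import load_dataset

ds = load_dataset("jacquelinegrimm/arabidopsis-roots-multi")
print(ds)  # inspect splits and columns (e.g., image / segmentation-mask features)
```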

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
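
For reference, these settings correspond roughly to the following `TrainingArguments`. This is a reconstruction, not the author's training script: dataset preprocessing and the `Trainer` setup are omitted, and the evaluation cadence is inferred from the results table below.

```python
# Hedged reconstruction of the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-arabidopsis-roots-multi",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: the results table reports one row per epoch
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default optimizer,
# so no explicit optimizer configuration is needed to match the listed settings.
```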

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Main root | Accuracy Lateral root | Accuracy Shoot | Accuracy Botrytis | Iou Background | Iou Main root | Iou Lateral root | Iou Shoot | Iou Botrytis |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.1832 | 1.0 | 20 | 1.2347 | 0.3383 | 0.4952 | 0.4799 | nan | 0.1614 | 0.7764 | 0.1747 | 0.8683 | 0.0 | 0.1377 | 0.5355 | 0.1700 | 0.8484 |
| 0.901 | 2.0 | 40 | 0.7864 | 0.3051 | 0.4105 | 0.3438 | nan | 0.0062 | 0.3933 | 0.3166 | 0.9258 | 0.0 | 0.0061 | 0.3238 | 0.2853 | 0.9101 |
| 0.7161 | 3.0 | 60 | 0.6386 | 0.3059 | 0.4133 | 0.3430 | nan | 0.0271 | 0.3389 | 0.3349 | 0.9523 | 0.0 | 0.0270 | 0.2919 | 0.2847 | 0.9261 |
| 0.6358 | 4.0 | 80 | 0.6021 | 0.3155 | 0.4225 | 0.3502 | nan | 0.0390 | 0.3338 | 0.3713 | 0.9458 | 0.0 | 0.0387 | 0.2878 | 0.3241 | 0.9271 |
| 0.5111 | 5.0 | 100 | 0.4770 | 0.2871 | 0.3779 | 0.3175 | nan | 0.0567 | 0.2880 | 0.2798 | 0.8871 | 0.0 | 0.0562 | 0.2493 | 0.2571 | 0.8728 |
| 0.4842 | 6.0 | 120 | 0.4069 | 0.2422 | 0.3232 | 0.2792 | nan | 0.1142 | 0.2559 | 0.3734 | 0.5495 | 0.0 | 0.1112 | 0.2290 | 0.3240 | 0.5470 |
| 0.3631 | 7.0 | 140 | 0.3527 | 0.2692 | 0.3581 | 0.3537 | nan | 0.3382 | 0.3411 | 0.3353 | 0.4179 | 0.0 | 0.3168 | 0.3021 | 0.3093 | 0.4179 |
| 0.3316 | 8.0 | 160 | 0.2986 | 0.4239 | 0.5731 | 0.5496 | nan | 0.4712 | 0.5003 | 0.5306 | 0.7904 | 0.0 | 0.4372 | 0.4333 | 0.4719 | 0.7773 |
| 0.3074 | 9.0 | 180 | 0.2755 | 0.3755 | 0.5061 | 0.4932 | nan | 0.4440 | 0.4876 | 0.5202 | 0.5727 | 0.0 | 0.4129 | 0.4281 | 0.4679 | 0.5687 |
| 0.3216 | 10.0 | 200 | 0.2317 | 0.4262 | 0.5762 | 0.5625 | nan | 0.5676 | 0.4656 | 0.5853 | 0.6864 | 0.0 | 0.5153 | 0.4206 | 0.5142 | 0.6811 |
| 0.2426 | 11.0 | 220 | 0.2022 | 0.4659 | 0.6319 | 0.5981 | nan | 0.5068 | 0.5241 | 0.6682 | 0.8285 | 0.0 | 0.4766 | 0.4531 | 0.5856 | 0.8139 |
| 0.2063 | 12.0 | 240 | 0.1752 | 0.4274 | 0.5759 | 0.5627 | nan | 0.5756 | 0.4687 | 0.6275 | 0.6319 | 0.0 | 0.5259 | 0.4287 | 0.5547 | 0.6277 |
| 0.1811 | 13.0 | 260 | 0.1542 | 0.4608 | 0.6216 | 0.6022 | nan | 0.5509 | 0.5639 | 0.6604 | 0.7113 | 0.0 | 0.5166 | 0.4893 | 0.5899 | 0.7083 |
| 0.1738 | 14.0 | 280 | 0.1396 | 0.4393 | 0.5926 | 0.5781 | nan | 0.6143 | 0.4492 | 0.6755 | 0.6314 | 0.0 | 0.5544 | 0.4218 | 0.5914 | 0.6291 |
| 0.2094 | 15.0 | 300 | 0.1193 | 0.4422 | 0.5906 | 0.5683 | nan | 0.5321 | 0.4922 | 0.6475 | 0.6907 | 0.0 | 0.4953 | 0.4505 | 0.5760 | 0.6891 |
| 0.129 | 16.0 | 320 | 0.1200 | 0.4816 | 0.6520 | 0.6415 | nan | 0.6646 | 0.5644 | 0.7487 | 0.6303 | 0.0 | 0.6072 | 0.5158 | 0.6552 | 0.6298 |
| 0.1326 | 17.0 | 340 | 0.1109 | 0.4622 | 0.6200 | 0.6005 | nan | 0.5568 | 0.5628 | 0.7030 | 0.6576 | 0.0 | 0.5257 | 0.5029 | 0.6270 | 0.6555 |
| 0.0935 | 18.0 | 360 | 0.1043 | 0.5018 | 0.6743 | 0.6542 | nan | 0.5992 | 0.6207 | 0.7253 | 0.7519 | 0.0 | 0.5665 | 0.5412 | 0.6523 | 0.7490 |
| 0.0862 | 19.0 | 380 | 0.1044 | 0.4790 | 0.6435 | 0.6272 | nan | 0.5848 | 0.6080 | 0.7212 | 0.6600 | 0.0 | 0.5538 | 0.5333 | 0.6492 | 0.6586 |
| 0.122 | 20.0 | 400 | 0.0940 | 0.5244 | 0.7048 | 0.6846 | nan | 0.6428 | 0.6277 | 0.7498 | 0.7987 | 0.0 | 0.6000 | 0.5576 | 0.6708 | 0.7939 |
| 0.0886 | 21.0 | 420 | 0.0945 | 0.5390 | 0.7263 | 0.7016 | nan | 0.6253 | 0.6676 | 0.7686 | 0.8439 | 0.0 | 0.5912 | 0.5754 | 0.6902 | 0.8382 |
| 0.0742 | 22.0 | 440 | 0.0933 | 0.4919 | 0.6620 | 0.6407 | nan | 0.6001 | 0.5945 | 0.7732 | 0.6803 | 0.0 | 0.5677 | 0.5301 | 0.6829 | 0.6788 |
| 0.0727 | 23.0 | 460 | 0.0886 | 0.4711 | 0.6279 | 0.6139 | nan | 0.6072 | 0.5489 | 0.6892 | 0.6664 | 0.0 | 0.5695 | 0.5005 | 0.6199 | 0.6656 |
| 0.0708 | 24.0 | 480 | 0.0854 | 0.4966 | 0.6684 | 0.6574 | nan | 0.6409 | 0.6351 | 0.7520 | 0.6454 | 0.0 | 0.6040 | 0.5612 | 0.6737 | 0.6443 |
| 0.0614 | 25.0 | 500 | 0.0857 | 0.4882 | 0.6563 | 0.6405 | nan | 0.6005 | 0.6244 | 0.7464 | 0.6541 | 0.0 | 0.5691 | 0.5524 | 0.6658 | 0.6535 |
| 0.151 | 26.0 | 520 | 0.0834 | 0.5106 | 0.6852 | 0.6667 | nan | 0.6271 | 0.6253 | 0.7566 | 0.7320 | 0.0 | 0.5918 | 0.5566 | 0.6762 | 0.7284 |
| 0.0658 | 27.0 | 540 | 0.0823 | 0.5055 | 0.6768 | 0.6636 | nan | 0.6741 | 0.5807 | 0.7498 | 0.7027 | 0.0 | 0.6250 | 0.5332 | 0.6673 | 0.7017 |
| 0.0643 | 28.0 | 560 | 0.0797 | 0.4959 | 0.6636 | 0.6481 | nan | 0.6315 | 0.5930 | 0.7399 | 0.6901 | 0.0 | 0.5921 | 0.5366 | 0.6619 | 0.6889 |
| 0.059 | 29.0 | 580 | 0.0782 | 0.5115 | 0.6845 | 0.6682 | nan | 0.6490 | 0.6092 | 0.7531 | 0.7265 | 0.0 | 0.6100 | 0.5472 | 0.6769 | 0.7234 |
| 0.1726 | 30.0 | 600 | 0.0786 | 0.5235 | 0.7039 | 0.6846 | nan | 0.6190 | 0.6734 | 0.7552 | 0.7679 | 0.0 | 0.5905 | 0.5825 | 0.6807 | 0.7639 |
| 0.0546 | 31.0 | 620 | 0.0745 | 0.5116 | 0.6843 | 0.6654 | nan | 0.6337 | 0.6072 | 0.7468 | 0.7494 | 0.0 | 0.5956 | 0.5467 | 0.6701 | 0.7457 |
| 0.1096 | 32.0 | 640 | 0.0746 | 0.5093 | 0.6814 | 0.6590 | nan | 0.5992 | 0.6202 | 0.7427 | 0.7634 | 0.0 | 0.5666 | 0.5550 | 0.6656 | 0.7595 |
| 0.0552 | 33.0 | 660 | 0.0750 | 0.5358 | 0.7160 | 0.6944 | nan | 0.6520 | 0.6285 | 0.7572 | 0.8262 | 0.0 | 0.6147 | 0.5639 | 0.6783 | 0.8223 |
| 0.0557 | 34.0 | 680 | 0.0731 | 0.5123 | 0.6878 | 0.6709 | nan | 0.6271 | 0.6496 | 0.7669 | 0.7076 | 0.0 | 0.5955 | 0.5745 | 0.6853 | 0.7060 |
| 0.0516 | 35.0 | 700 | 0.0733 | 0.5307 | 0.7108 | 0.6929 | nan | 0.6696 | 0.6260 | 0.7665 | 0.7813 | 0.0 | 0.6276 | 0.5654 | 0.6825 | 0.7781 |
| 0.105 | 36.0 | 720 | 0.0717 | 0.5242 | 0.7020 | 0.6823 | nan | 0.6365 | 0.6397 | 0.7633 | 0.7684 | 0.0 | 0.6029 | 0.5682 | 0.6841 | 0.7660 |
| 0.1305 | 37.0 | 740 | 0.0713 | 0.5232 | 0.7002 | 0.6845 | nan | 0.6739 | 0.6137 | 0.7624 | 0.7510 | 0.0 | 0.6289 | 0.5585 | 0.6794 | 0.7490 |
| 0.0479 | 38.0 | 760 | 0.0707 | 0.5196 | 0.6944 | 0.6763 | nan | 0.6517 | 0.6116 | 0.7556 | 0.7588 | 0.0 | 0.6125 | 0.5532 | 0.6754 | 0.7567 |
| 0.0552 | 39.0 | 780 | 0.0705 | 0.5198 | 0.6977 | 0.6837 | nan | 0.6561 | 0.6543 | 0.7664 | 0.7139 | 0.0 | 0.6209 | 0.5803 | 0.6852 | 0.7124 |
| 0.0997 | 40.0 | 800 | 0.0699 | 0.5219 | 0.6968 | 0.6789 | nan | 0.6586 | 0.6057 | 0.7478 | 0.7752 | 0.0 | 0.6180 | 0.5504 | 0.6680 | 0.7729 |
| 0.0774 | 41.0 | 820 | 0.0708 | 0.5171 | 0.6900 | 0.6727 | nan | 0.6592 | 0.5927 | 0.7395 | 0.7686 | 0.0 | 0.6163 | 0.5402 | 0.6635 | 0.7656 |
| 0.0899 | 42.0 | 840 | 0.0692 | 0.5324 | 0.7125 | 0.6927 | nan | 0.6683 | 0.6169 | 0.7765 | 0.7883 | 0.0 | 0.6265 | 0.5601 | 0.6902 | 0.7851 |
| 0.0492 | 43.0 | 860 | 0.0682 | 0.5390 | 0.7216 | 0.7043 | nan | 0.6740 | 0.6497 | 0.7676 | 0.7950 | 0.0 | 0.6348 | 0.5793 | 0.6891 | 0.7918 |
| 0.0712 | 44.0 | 880 | 0.0690 | 0.5121 | 0.6844 | 0.6692 | nan | 0.6570 | 0.6071 | 0.7533 | 0.7204 | 0.0 | 0.6153 | 0.5524 | 0.6743 | 0.7186 |
| 0.1034 | 45.0 | 900 | 0.0685 | 0.5503 | 0.7379 | 0.7191 | nan | 0.6822 | 0.6645 | 0.7832 | 0.8215 | 0.0 | 0.6439 | 0.5905 | 0.6994 | 0.8175 |
| 0.0478 | 46.0 | 920 | 0.0681 | 0.5365 | 0.7179 | 0.6998 | nan | 0.6726 | 0.6369 | 0.7719 | 0.7902 | 0.0 | 0.6326 | 0.5728 | 0.6901 | 0.7869 |
| 0.0452 | 47.0 | 940 | 0.0682 | 0.5341 | 0.7157 | 0.6993 | nan | 0.6723 | 0.6495 | 0.7765 | 0.7647 | 0.0 | 0.6346 | 0.5803 | 0.6935 | 0.7621 |
| 0.0542 | 48.0 | 960 | 0.0675 | 0.5382 | 0.7206 | 0.7021 | nan | 0.6695 | 0.6444 | 0.7726 | 0.7961 | 0.0 | 0.6313 | 0.5772 | 0.6899 | 0.7928 |
| 0.0738 | 49.0 | 980 | 0.0680 | 0.5360 | 0.7168 | 0.6977 | nan | 0.6714 | 0.6254 | 0.7666 | 0.8040 | 0.0 | 0.6302 | 0.5658 | 0.6838 | 0.8004 |
| 0.1169 | 50.0 | 1000 | 0.0675 | 0.5272 | 0.7048 | 0.6874 | nan | 0.6670 | 0.6181 | 0.7576 | 0.7765 | 0.0 | 0.6256 | 0.5603 | 0.6767 | 0.7734 |
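
The `nan` background accuracy and 0.0 background IoU throughout suggest that background pixels were excluded from evaluation (e.g., mapped to an ignore index). Below is a sketch of how these metrics are commonly computed for SegFormer fine-tunes with the `evaluate` library's `mean_iou` metric; the actual evaluation code for this model is not published, and the label set is taken from the per-class metrics above.

```python
# Hedged sketch of metric computation with evaluate's mean_iou (not the author's code).
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")
id2label = {0: "background", 1: "main root", 2: "lateral root", 3: "shoot", 4: "botrytis"}

# predictions / references: lists of (H, W) integer masks; placeholders here.
predictions = [np.zeros((64, 64), dtype=np.int64)]
references = [np.zeros((64, 64), dtype=np.int64)]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=len(id2label),
    ignore_index=255,     # pixels labeled 255 are excluded from all metrics
    reduce_labels=False,  # set True only if masks need label 0 remapped to the ignore index
)
print(results["mean_iou"], results["per_category_iou"])
```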

### Framework versions

- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.1
- Tokenizers 0.15.2