
# segformer-b0-finetuned-arabidopsis-roots-v02

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the [jacquelinegrimm/arabidopsis-roots-v02](https://huggingface.co/datasets/jacquelinegrimm/arabidopsis-roots-v02) dataset. It achieves the following results on the evaluation set:

- Loss: 0.0500
- Mean IoU: 0.4449
- Mean Accuracy: 0.8899
- Overall Accuracy: 0.8899
- Accuracy Background: nan
- Accuracy Seedling: 0.8899
- IoU Background: 0.0
- IoU Seedling: 0.8899
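
For quick use, here is a minimal inference sketch (not part of the original card). The repo id and the label order are assumptions inferred from the card title, the dataset owner, and the class names in the metrics above:

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Assumed repo id, inferred from the card title and the dataset owner.
repo_id = "jacquelinegrimm/segformer-b0-finetuned-arabidopsis-roots-v02"
processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("seedling.png").convert("RGB")  # replace with your own image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]  # assumed labels: 0 = background, 1 = seedling
```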

## Model description

This is a SegFormer semantic-segmentation model with the lightweight MiT-b0 encoder (about 3.72M parameters), fine-tuned for binary segmentation of Arabidopsis seedling images into background and seedling classes. More information needed.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 6e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
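
For context, here is a sketch of how these values might map onto transformers' `TrainingArguments`. The actual training script is not included in this card, so treat this as an assumption-laden reconstruction:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above;
# the card does not include the actual training script.
training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-arabidopsis-roots-v02",
    learning_rate=6e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",  # assumed: the results table reports one eval per epoch
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the Trainer's default
# AdamW configuration, so no optimizer override is needed.
```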

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean IoU | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Seedling | IoU Background | IoU Seedling |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-----------------:|:--------------:|:------------:|
| 0.4792        | 1.0   | 20   | 0.5681          | 0.4482   | 0.8964        | 0.8964           | nan                 | 0.8964            | 0.0            | 0.8964       |
| 0.3118        | 2.0   | 40   | 0.3829          | 0.4641   | 0.9282        | 0.9282           | nan                 | 0.9282            | 0.0            | 0.9282       |
| 0.2481        | 3.0   | 60   | 0.2563          | 0.4722   | 0.9445        | 0.9445           | nan                 | 0.9445            | 0.0            | 0.9445       |
| 0.2159        | 4.0   | 80   | 0.2402          | 0.4695   | 0.9389        | 0.9389           | nan                 | 0.9389            | 0.0            | 0.9389       |
| 0.1721        | 5.0   | 100  | 0.1644          | 0.4627   | 0.9254        | 0.9254           | nan                 | 0.9254            | 0.0            | 0.9254       |
| 0.1494        | 6.0   | 120  | 0.1407          | 0.4595   | 0.9189        | 0.9189           | nan                 | 0.9189            | 0.0            | 0.9189       |
| 0.1286        | 7.0   | 140  | 0.1213          | 0.4616   | 0.9233        | 0.9233           | nan                 | 0.9233            | 0.0            | 0.9233       |
| 0.1233        | 8.0   | 160  | 0.1097          | 0.4469   | 0.8939        | 0.8939           | nan                 | 0.8939            | 0.0            | 0.8939       |
| 0.108         | 9.0   | 180  | 0.0963          | 0.4419   | 0.8837        | 0.8837           | nan                 | 0.8837            | 0.0            | 0.8837       |
| 0.0968        | 10.0  | 200  | 0.0998          | 0.4491   | 0.8983        | 0.8983           | nan                 | 0.8983            | 0.0            | 0.8983       |
| 0.1041        | 11.0  | 220  | 0.0849          | 0.4354   | 0.8708        | 0.8708           | nan                 | 0.8708            | 0.0            | 0.8708       |
| 0.0842        | 12.0  | 240  | 0.0795          | 0.4444   | 0.8888        | 0.8888           | nan                 | 0.8888            | 0.0            | 0.8888       |
| 0.0896        | 13.0  | 260  | 0.0756          | 0.4402   | 0.8803        | 0.8803           | nan                 | 0.8803            | 0.0            | 0.8803       |
| 0.077         | 14.0  | 280  | 0.0797          | 0.4484   | 0.8968        | 0.8968           | nan                 | 0.8968            | 0.0            | 0.8968       |
| 0.0767        | 15.0  | 300  | 0.0690          | 0.4389   | 0.8779        | 0.8779           | nan                 | 0.8779            | 0.0            | 0.8779       |
| 0.0776        | 16.0  | 320  | 0.0665          | 0.4355   | 0.8710        | 0.8710           | nan                 | 0.8710            | 0.0            | 0.8710       |
| 0.0789        | 17.0  | 340  | 0.0680          | 0.4450   | 0.8901        | 0.8901           | nan                 | 0.8901            | 0.0            | 0.8901       |
| 0.071         | 18.0  | 360  | 0.0660          | 0.4424   | 0.8848        | 0.8848           | nan                 | 0.8848            | 0.0            | 0.8848       |
| 0.0661        | 19.0  | 380  | 0.0677          | 0.4371   | 0.8742        | 0.8742           | nan                 | 0.8742            | 0.0            | 0.8742       |
| 0.0784        | 20.0  | 400  | 0.0639          | 0.4402   | 0.8805        | 0.8805           | nan                 | 0.8805            | 0.0            | 0.8805       |
| 0.0789        | 21.0  | 420  | 0.0602          | 0.4369   | 0.8737        | 0.8737           | nan                 | 0.8737            | 0.0            | 0.8737       |
| 0.0756        | 22.0  | 440  | 0.0595          | 0.4313   | 0.8627        | 0.8627           | nan                 | 0.8627            | 0.0            | 0.8627       |
| 0.0563        | 23.0  | 460  | 0.0581          | 0.4420   | 0.8839        | 0.8839           | nan                 | 0.8839            | 0.0            | 0.8839       |
| 0.0713        | 24.0  | 480  | 0.0570          | 0.4438   | 0.8875        | 0.8875           | nan                 | 0.8875            | 0.0            | 0.8875       |
| 0.0594        | 25.0  | 500  | 0.0579          | 0.4478   | 0.8955        | 0.8955           | nan                 | 0.8955            | 0.0            | 0.8955       |
| 0.0636        | 26.0  | 520  | 0.0582          | 0.4523   | 0.9046        | 0.9046           | nan                 | 0.9046            | 0.0            | 0.9046       |
| 0.053         | 27.0  | 540  | 0.0558          | 0.4431   | 0.8863        | 0.8863           | nan                 | 0.8863            | 0.0            | 0.8863       |
| 0.0688        | 28.0  | 560  | 0.0561          | 0.4470   | 0.8939        | 0.8939           | nan                 | 0.8939            | 0.0            | 0.8939       |
| 0.0564        | 29.0  | 580  | 0.0542          | 0.4442   | 0.8885        | 0.8885           | nan                 | 0.8885            | 0.0            | 0.8885       |
| 0.0671        | 30.0  | 600  | 0.0537          | 0.4445   | 0.8891        | 0.8891           | nan                 | 0.8891            | 0.0            | 0.8891       |
| 0.0699        | 31.0  | 620  | 0.0530          | 0.4397   | 0.8794        | 0.8794           | nan                 | 0.8794            | 0.0            | 0.8794       |
| 0.0648        | 32.0  | 640  | 0.0535          | 0.4423   | 0.8845        | 0.8845           | nan                 | 0.8845            | 0.0            | 0.8845       |
| 0.058         | 33.0  | 660  | 0.0535          | 0.4467   | 0.8934        | 0.8934           | nan                 | 0.8934            | 0.0            | 0.8934       |
| 0.0582        | 34.0  | 680  | 0.0536          | 0.4513   | 0.9025        | 0.9025           | nan                 | 0.9025            | 0.0            | 0.9025       |
| 0.0561        | 35.0  | 700  | 0.0525          | 0.4444   | 0.8888        | 0.8888           | nan                 | 0.8888            | 0.0            | 0.8888       |
| 0.0731        | 36.0  | 720  | 0.0520          | 0.4439   | 0.8878        | 0.8878           | nan                 | 0.8878            | 0.0            | 0.8878       |
| 0.0503        | 37.0  | 740  | 0.0528          | 0.4484   | 0.8968        | 0.8968           | nan                 | 0.8968            | 0.0            | 0.8968       |
| 0.0895        | 38.0  | 760  | 0.0505          | 0.4388   | 0.8777        | 0.8777           | nan                 | 0.8777            | 0.0            | 0.8777       |
| 0.064         | 39.0  | 780  | 0.0508          | 0.4418   | 0.8836        | 0.8836           | nan                 | 0.8836            | 0.0            | 0.8836       |
| 0.0481        | 40.0  | 800  | 0.0512          | 0.4474   | 0.8949        | 0.8949           | nan                 | 0.8949            | 0.0            | 0.8949       |
| 0.0497        | 41.0  | 820  | 0.0533          | 0.4503   | 0.9006        | 0.9006           | nan                 | 0.9006            | 0.0            | 0.9006       |
| 0.06          | 42.0  | 840  | 0.0511          | 0.4447   | 0.8893        | 0.8893           | nan                 | 0.8893            | 0.0            | 0.8893       |
| 0.0543        | 43.0  | 860  | 0.0504          | 0.4441   | 0.8882        | 0.8882           | nan                 | 0.8882            | 0.0            | 0.8882       |
| 0.0539        | 44.0  | 880  | 0.0515          | 0.4421   | 0.8843        | 0.8843           | nan                 | 0.8843            | 0.0            | 0.8843       |
| 0.0461        | 45.0  | 900  | 0.0506          | 0.4461   | 0.8921        | 0.8921           | nan                 | 0.8921            | 0.0            | 0.8921       |
| 0.0554        | 46.0  | 920  | 0.0501          | 0.4453   | 0.8907        | 0.8907           | nan                 | 0.8907            | 0.0            | 0.8907       |
| 0.0475        | 47.0  | 940  | 0.0503          | 0.4471   | 0.8942        | 0.8942           | nan                 | 0.8942            | 0.0            | 0.8942       |
| 0.0645        | 48.0  | 960  | 0.0506          | 0.4463   | 0.8926        | 0.8926           | nan                 | 0.8926            | 0.0            | 0.8926       |
| 0.045         | 49.0  | 980  | 0.0510          | 0.4471   | 0.8942        | 0.8942           | nan                 | 0.8942            | 0.0            | 0.8942       |
| 0.0576        | 50.0  | 1000 | 0.0500          | 0.4449   | 0.8899        | 0.8899           | nan                 | 0.8899            | 0.0            | 0.8899       |
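
Throughout the table, Accuracy Background is nan and IoU Background is 0.0, which is the pattern the `evaluate` library's `mean_iou` metric produces when ground-truth background pixels are excluded from evaluation (e.g. remapped to the ignore index), so the reported means effectively reflect the seedling class only. Below is a sketch of such a metric computation, assuming the common SegFormer fine-tuning recipe rather than this card's actual script:

```python
import evaluate
import torch

metric = evaluate.load("mean_iou")
id2label = {0: "background", 1: "seedling"}  # assumed label mapping

def compute_metrics(eval_pred):
    logits, labels = eval_pred  # numpy arrays from the Trainer
    # Upsample the 1/4-resolution logits to the label maps, then take the
    # per-pixel argmax.
    upsampled = torch.nn.functional.interpolate(
        torch.from_numpy(logits),
        size=labels.shape[-2:],
        mode="bilinear",
        align_corners=False,
    )
    preds = upsampled.argmax(dim=1).numpy()
    return metric.compute(
        predictions=preds,
        references=labels,
        num_labels=len(id2label),
        ignore_index=255,  # assumed: background pixels remapped to 255, which
                           # would explain the nan background accuracy above
    )
```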

### Framework versions

- Transformers 4.36.2
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.15.0