
segformer-b0-finetuned-lipid-droplets-v2

This model is a fine-tuned version of nvidia/mit-b0 on the jhaberbe/lipid-droplets-v3 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1451
  • Mean Iou: 0.3767
  • Mean Accuracy: 0.7533
  • Overall Accuracy: 0.7533
  • Accuracy Unlabeled: nan
  • Accuracy Lipid: 0.7533
  • Iou Unlabeled: 0.0
  • Iou Lipid: 0.7533
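The reported Mean IoU follows directly from averaging the two per-class IoUs, so the 0.0 IoU on the unlabeled class halves the lipid-class score. A quick check of the arithmetic:

```python
# Per-class IoUs from the evaluation results above.
iou_unlabeled = 0.0
iou_lipid = 0.7533

# Mean IoU averages over classes, so the zero unlabeled-class IoU
# halves the lipid score: (0.0 + 0.7533) / 2 ≈ 0.3767, as reported.
mean_iou = (iou_unlabeled + iou_lipid) / 2
print(mean_iou)
```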

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
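With `lr_scheduler_type: linear`, the learning rate decays linearly from its initial value over the run. The card does not record warmup settings, so the sketch below assumes no warmup and decay to zero over the 1100 optimizer steps seen in the training-results table:

```python
def linear_lr(step, base_lr=6e-5, total_steps=1100):
    """Linear decay from base_lr to 0 (assumed: no warmup steps)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))     # full learning rate at the first step
print(linear_lr(550))   # half the learning rate midway through
print(linear_lr(1100))  # decayed to zero at the final step
```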

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Lipid | Iou Unlabeled | Iou Lipid |
|---------------|-------|------|-----------------|----------|---------------|------------------|--------------------|----------------|---------------|-----------|
| 0.5878 | 1.82 | 20 | 0.6215 | 0.2190 | 0.4381 | 0.4381 | nan | 0.4381 | 0.0 | 0.4381 |
| 0.4506 | 3.64 | 40 | 0.4553 | 0.2542 | 0.5084 | 0.5084 | nan | 0.5084 | 0.0 | 0.5084 |
| 0.3435 | 5.45 | 60 | 0.3476 | 0.2703 | 0.5405 | 0.5405 | nan | 0.5405 | 0.0 | 0.5405 |
| 0.2931 | 7.27 | 80 | 0.2191 | 0.2148 | 0.4296 | 0.4296 | nan | 0.4296 | 0.0 | 0.4296 |
| 0.2361 | 9.09 | 100 | 0.2199 | 0.2829 | 0.5659 | 0.5659 | nan | 0.5659 | 0.0 | 0.5659 |
| 0.2739 | 10.91 | 120 | 0.2493 | 0.2668 | 0.5336 | 0.5336 | nan | 0.5336 | 0.0 | 0.5336 |
| 0.1993 | 12.73 | 140 | 0.2123 | 0.3617 | 0.7234 | 0.7234 | nan | 0.7234 | 0.0 | 0.7234 |
| 0.1226 | 14.55 | 160 | 0.0776 | 0.1070 | 0.2140 | 0.2140 | nan | 0.2140 | 0.0 | 0.2140 |
| 0.1348 | 16.36 | 180 | 0.1453 | 0.3343 | 0.6686 | 0.6686 | nan | 0.6686 | 0.0 | 0.6686 |
| 0.0939 | 18.18 | 200 | 0.1973 | 0.3632 | 0.7265 | 0.7265 | nan | 0.7265 | 0.0 | 0.7265 |
| 0.1426 | 20.0 | 220 | 0.1698 | 0.2628 | 0.5256 | 0.5256 | nan | 0.5256 | 0.0 | 0.5256 |
| 0.0722 | 21.82 | 240 | 0.1061 | 0.3520 | 0.7040 | 0.7040 | nan | 0.7040 | 0.0 | 0.7040 |
| 0.0764 | 23.64 | 260 | 0.1653 | 0.2990 | 0.5979 | 0.5979 | nan | 0.5979 | 0.0 | 0.5979 |
| 0.0551 | 25.45 | 280 | 0.1072 | 0.3103 | 0.6206 | 0.6206 | nan | 0.6206 | 0.0 | 0.6206 |
| 0.0437 | 27.27 | 300 | 0.2012 | 0.3494 | 0.6988 | 0.6988 | nan | 0.6988 | 0.0 | 0.6988 |
| 0.0514 | 29.09 | 320 | 0.1825 | 0.3777 | 0.7553 | 0.7553 | nan | 0.7553 | 0.0 | 0.7553 |
| 0.1095 | 30.91 | 340 | 0.0897 | 0.3392 | 0.6785 | 0.6785 | nan | 0.6785 | 0.0 | 0.6785 |
| 0.0682 | 32.73 | 360 | 0.1785 | 0.3504 | 0.7008 | 0.7008 | nan | 0.7008 | 0.0 | 0.7008 |
| 0.0422 | 34.55 | 380 | 0.1167 | 0.3444 | 0.6887 | 0.6887 | nan | 0.6887 | 0.0 | 0.6887 |
| 0.0538 | 36.36 | 400 | 0.2332 | 0.4529 | 0.9057 | 0.9057 | nan | 0.9057 | 0.0 | 0.9057 |
| 0.0347 | 38.18 | 420 | 0.1698 | 0.3115 | 0.6231 | 0.6231 | nan | 0.6231 | 0.0 | 0.6231 |
| 0.0496 | 40.0 | 440 | 0.1201 | 0.3278 | 0.6555 | 0.6555 | nan | 0.6555 | 0.0 | 0.6555 |
| 0.0681 | 41.82 | 460 | 0.1830 | 0.3916 | 0.7831 | 0.7831 | nan | 0.7831 | 0.0 | 0.7831 |
| 0.0498 | 43.64 | 480 | 0.1848 | 0.4086 | 0.8172 | 0.8172 | nan | 0.8172 | 0.0 | 0.8172 |
| 0.0365 | 45.45 | 500 | 0.1234 | 0.3741 | 0.7481 | 0.7481 | nan | 0.7481 | 0.0 | 0.7481 |
| 0.0258 | 47.27 | 520 | 0.2000 | 0.4300 | 0.8599 | 0.8599 | nan | 0.8599 | 0.0 | 0.8599 |
| 0.0355 | 49.09 | 540 | 0.1273 | 0.3907 | 0.7814 | 0.7814 | nan | 0.7814 | 0.0 | 0.7814 |
| 0.0476 | 50.91 | 560 | 0.1827 | 0.4322 | 0.8644 | 0.8644 | nan | 0.8644 | 0.0 | 0.8644 |
| 0.0472 | 52.73 | 580 | 0.1014 | 0.3420 | 0.6839 | 0.6839 | nan | 0.6839 | 0.0 | 0.6839 |
| 0.0467 | 54.55 | 600 | 0.1330 | 0.3758 | 0.7516 | 0.7516 | nan | 0.7516 | 0.0 | 0.7516 |
| 0.0802 | 56.36 | 620 | 0.0698 | 0.3073 | 0.6147 | 0.6147 | nan | 0.6147 | 0.0 | 0.6147 |
| 0.0558 | 58.18 | 640 | 0.1291 | 0.4088 | 0.8176 | 0.8176 | nan | 0.8176 | 0.0 | 0.8176 |
| 0.0443 | 60.0 | 660 | 0.1097 | 0.4060 | 0.8119 | 0.8119 | nan | 0.8119 | 0.0 | 0.8119 |
| 0.0442 | 61.82 | 680 | 0.1112 | 0.3944 | 0.7887 | 0.7887 | nan | 0.7887 | 0.0 | 0.7887 |
| 0.0422 | 63.64 | 700 | 0.1837 | 0.4288 | 0.8576 | 0.8576 | nan | 0.8576 | 0.0 | 0.8576 |
| 0.0216 | 65.45 | 720 | 0.1140 | 0.3735 | 0.7470 | 0.7470 | nan | 0.7470 | 0.0 | 0.7470 |
| 0.0414 | 67.27 | 740 | 0.1017 | 0.3821 | 0.7643 | 0.7643 | nan | 0.7643 | 0.0 | 0.7643 |
| 0.0336 | 69.09 | 760 | 0.1458 | 0.3685 | 0.7370 | 0.7370 | nan | 0.7370 | 0.0 | 0.7370 |
| 0.0575 | 70.91 | 780 | 0.1392 | 0.3425 | 0.6851 | 0.6851 | nan | 0.6851 | 0.0 | 0.6851 |
| 0.0324 | 72.73 | 800 | 0.1162 | 0.3689 | 0.7377 | 0.7377 | nan | 0.7377 | 0.0 | 0.7377 |
| 0.0336 | 74.55 | 820 | 0.1366 | 0.4143 | 0.8287 | 0.8287 | nan | 0.8287 | 0.0 | 0.8287 |
| 0.0889 | 76.36 | 840 | 0.1604 | 0.3726 | 0.7452 | 0.7452 | nan | 0.7452 | 0.0 | 0.7452 |
| 0.0438 | 78.18 | 860 | 0.1528 | 0.3948 | 0.7895 | 0.7895 | nan | 0.7895 | 0.0 | 0.7895 |
| 0.0194 | 80.0 | 880 | 0.1360 | 0.3335 | 0.6671 | 0.6671 | nan | 0.6671 | 0.0 | 0.6671 |
| 0.0311 | 81.82 | 900 | 0.1832 | 0.4278 | 0.8555 | 0.8555 | nan | 0.8555 | 0.0 | 0.8555 |
| 0.0478 | 83.64 | 920 | 0.0718 | 0.2656 | 0.5312 | 0.5312 | nan | 0.5312 | 0.0 | 0.5312 |
| 0.0363 | 85.45 | 940 | 0.1281 | 0.3630 | 0.7259 | 0.7259 | nan | 0.7259 | 0.0 | 0.7259 |
| 0.0354 | 87.27 | 960 | 0.0986 | 0.3891 | 0.7782 | 0.7782 | nan | 0.7782 | 0.0 | 0.7782 |
| 0.0195 | 89.09 | 980 | 0.1582 | 0.4123 | 0.8247 | 0.8247 | nan | 0.8247 | 0.0 | 0.8247 |
| 0.0432 | 90.91 | 1000 | 0.1086 | 0.3217 | 0.6433 | 0.6433 | nan | 0.6433 | 0.0 | 0.6433 |
| 0.0288 | 92.73 | 1020 | 0.1669 | 0.4089 | 0.8179 | 0.8179 | nan | 0.8179 | 0.0 | 0.8179 |
| 0.0326 | 94.55 | 1040 | 0.1335 | 0.3908 | 0.7816 | 0.7816 | nan | 0.7816 | 0.0 | 0.7816 |
| 0.0384 | 96.36 | 1060 | 0.1415 | 0.3947 | 0.7895 | 0.7895 | nan | 0.7895 | 0.0 | 0.7895 |
| 0.0399 | 98.18 | 1080 | 0.1784 | 0.4204 | 0.8408 | 0.8408 | nan | 0.8408 | 0.0 | 0.8408 |
| 0.0214 | 100.0 | 1100 | 0.1451 | 0.3767 | 0.7533 | 0.7533 | nan | 0.7533 | 0.0 | 0.7533 |
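At inference time, SegFormer predicts per-class logits at 1/4 of the input resolution, which are upsampled and argmaxed to get a per-pixel segmentation. A minimal numpy sketch of that post-processing step, using random logits in place of real model output (the class order, unlabeled = 0 and lipid = 1, is an assumption based on the metrics above; actual inference would use `SegformerForSemanticSegmentation` from transformers):

```python
import numpy as np

# Dummy logits standing in for model output: (num_classes, H/4, W/4).
# Assumed class order: 0 = unlabeled, 1 = lipid.
rng = np.random.default_rng(0)
logits = rng.standard_normal((2, 128, 128))

# Per-pixel class prediction, then nearest-neighbor upsampling back to
# the input resolution (SegFormer predicts at 1/4 scale).
pred = logits.argmax(axis=0)                     # (128, 128), values in {0, 1}
mask = pred.repeat(4, axis=0).repeat(4, axis=1)  # (512, 512) binary lipid mask

lipid_fraction = mask.mean()  # fraction of pixels predicted as lipid
print(mask.shape, float(lipid_fraction))
```

In practice the upsampling is done with bilinear interpolation on the logits before the argmax (e.g. `torch.nn.functional.interpolate`), which gives smoother mask boundaries than the nearest-neighbor shortcut shown here.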

Framework versions

  • Transformers 4.37.2
  • Pytorch 2.1.0+cu121
  • Datasets 2.17.1
  • Tokenizers 0.15.2
Model details

  • Format: Safetensors
  • Model size: 3.72M params
  • Tensor type: F32