
segformer-finetuned-biofilm2

This model is a fine-tuned version of nvidia/mit-b0 on the heroza/biofilm2 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0797
  • Mean Iou: 0.4786
  • Mean Accuracy: 0.9572
  • Overall Accuracy: 0.9572
  • Accuracy Background: nan
  • Accuracy Biofilm: 0.9572
  • Iou Background: 0.0
  • Iou Biofilm: 0.9572
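
The snippet below is a minimal inference sketch for this checkpoint using the `transformers` API. It assumes the repository id `heroza/segformer-finetuned-biofilm2` (inferred from the model name above) and an example image path; adjust both for your setup:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Assumed repository id; replace with the actual Hub id of this checkpoint.
repo_id = "heroza/segformer-finetuned-biofilm2"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

# Example input: any RGB image of a biofilm sample (path is a placeholder).
image = Image.open("biofilm_example.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel argmax
# to obtain a class-id mask (e.g. background vs. biofilm).
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]
```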

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 1337
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: polynomial
  • training_steps: 10000
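
As a rough illustration, these values map onto `transformers.TrainingArguments` as in the sketch below; this covers the configuration only, not the full training script (dataset loading, the image processor, and the `Trainer` call are omitted, and `output_dir` is an assumption):

```python
from transformers import TrainingArguments

# Configuration sketch mirroring the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="segformer-finetuned-biofilm2",  # assumed output directory
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="polynomial",
    max_steps=10000,
)
```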

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Biofilm | Iou Background | Iou Biofilm |
|:--------------|:------|:-----|:----------------|:---------|:--------------|:-----------------|:--------------------|:-----------------|:---------------|:------------|
| 0.1622 | 1.0 | 280 | 0.1158 | 0.4714 | 0.9428 | 0.9428 | nan | 0.9428 | 0.0 | 0.9428 |
| 0.0742 | 2.0 | 560 | 0.0643 | 0.4545 | 0.9090 | 0.9090 | nan | 0.9090 | 0.0 | 0.9090 |
| 0.0549 | 3.0 | 840 | 0.0582 | 0.4797 | 0.9594 | 0.9594 | nan | 0.9594 | 0.0 | 0.9594 |
| 0.0459 | 4.0 | 1120 | 0.0508 | 0.4737 | 0.9475 | 0.9475 | nan | 0.9475 | 0.0 | 0.9475 |
| 0.0506 | 5.0 | 1400 | 0.0405 | 0.4705 | 0.9411 | 0.9411 | nan | 0.9411 | 0.0 | 0.9411 |
| 0.0411 | 6.0 | 1680 | 0.0476 | 0.4865 | 0.9729 | 0.9729 | nan | 0.9729 | 0.0 | 0.9729 |
| 0.0456 | 7.0 | 1960 | 0.0476 | 0.4754 | 0.9509 | 0.9509 | nan | 0.9509 | 0.0 | 0.9509 |
| 0.0381 | 8.0 | 2240 | 0.0554 | 0.4792 | 0.9584 | 0.9584 | nan | 0.9584 | 0.0 | 0.9584 |
| 0.0348 | 9.0 | 2520 | 0.0559 | 0.4889 | 0.9779 | 0.9779 | nan | 0.9779 | 0.0 | 0.9779 |
| 0.0388 | 10.0 | 2800 | 0.0513 | 0.4757 | 0.9514 | 0.9514 | nan | 0.9514 | 0.0 | 0.9514 |
| 0.0385 | 11.0 | 3080 | 0.0660 | 0.4883 | 0.9767 | 0.9767 | nan | 0.9767 | 0.0 | 0.9767 |
| 0.0309 | 12.0 | 3360 | 0.0589 | 0.4808 | 0.9616 | 0.9616 | nan | 0.9616 | 0.0 | 0.9616 |
| 0.0322 | 13.0 | 3640 | 0.0539 | 0.4796 | 0.9592 | 0.9592 | nan | 0.9592 | 0.0 | 0.9592 |
| 0.0361 | 14.0 | 3920 | 0.0621 | 0.4812 | 0.9625 | 0.9625 | nan | 0.9625 | 0.0 | 0.9625 |
| 0.0277 | 15.0 | 4200 | 0.0576 | 0.4836 | 0.9672 | 0.9672 | nan | 0.9672 | 0.0 | 0.9672 |
| 0.0324 | 16.0 | 4480 | 0.0503 | 0.4702 | 0.9404 | 0.9404 | nan | 0.9404 | 0.0 | 0.9404 |
| 0.0355 | 17.0 | 4760 | 0.0583 | 0.4801 | 0.9601 | 0.9601 | nan | 0.9601 | 0.0 | 0.9601 |
| 0.032 | 18.0 | 5040 | 0.0528 | 0.4679 | 0.9358 | 0.9358 | nan | 0.9358 | 0.0 | 0.9358 |
| 0.0275 | 19.0 | 5320 | 0.0682 | 0.4828 | 0.9656 | 0.9656 | nan | 0.9656 | 0.0 | 0.9656 |
| 0.0329 | 20.0 | 5600 | 0.0712 | 0.4796 | 0.9591 | 0.9591 | nan | 0.9591 | 0.0 | 0.9591 |
| 0.0284 | 21.0 | 5880 | 0.0769 | 0.4868 | 0.9737 | 0.9737 | nan | 0.9737 | 0.0 | 0.9737 |
| 0.028 | 22.0 | 6160 | 0.0615 | 0.4826 | 0.9651 | 0.9651 | nan | 0.9651 | 0.0 | 0.9651 |
| 0.0275 | 23.0 | 6440 | 0.0640 | 0.4797 | 0.9595 | 0.9595 | nan | 0.9595 | 0.0 | 0.9595 |
| 0.0263 | 24.0 | 6720 | 0.0805 | 0.4819 | 0.9639 | 0.9639 | nan | 0.9639 | 0.0 | 0.9639 |
| 0.0252 | 25.0 | 7000 | 0.0700 | 0.4830 | 0.9661 | 0.9661 | nan | 0.9661 | 0.0 | 0.9661 |
| 0.0309 | 26.0 | 7280 | 0.0747 | 0.4854 | 0.9709 | 0.9709 | nan | 0.9709 | 0.0 | 0.9709 |
| 0.0238 | 27.0 | 7560 | 0.0704 | 0.4814 | 0.9628 | 0.9628 | nan | 0.9628 | 0.0 | 0.9628 |
| 0.0277 | 28.0 | 7840 | 0.0757 | 0.4858 | 0.9716 | 0.9716 | nan | 0.9716 | 0.0 | 0.9716 |
| 0.0281 | 29.0 | 8120 | 0.0847 | 0.4830 | 0.9661 | 0.9661 | nan | 0.9661 | 0.0 | 0.9661 |
| 0.0259 | 30.0 | 8400 | 0.0741 | 0.4820 | 0.9640 | 0.9640 | nan | 0.9640 | 0.0 | 0.9640 |
| 0.0231 | 31.0 | 8680 | 0.0726 | 0.4794 | 0.9587 | 0.9587 | nan | 0.9587 | 0.0 | 0.9587 |
| 0.0234 | 32.0 | 8960 | 0.0739 | 0.4779 | 0.9557 | 0.9557 | nan | 0.9557 | 0.0 | 0.9557 |
| 0.0226 | 33.0 | 9240 | 0.0743 | 0.4806 | 0.9613 | 0.9613 | nan | 0.9613 | 0.0 | 0.9613 |
| 0.0242 | 34.0 | 9520 | 0.0776 | 0.4792 | 0.9584 | 0.9584 | nan | 0.9584 | 0.0 | 0.9584 |
| 0.0211 | 35.0 | 9800 | 0.0775 | 0.4765 | 0.9529 | 0.9529 | nan | 0.9529 | 0.0 | 0.9529 |
| 0.0223 | 35.71 | 10000 | 0.0797 | 0.4786 | 0.9572 | 0.9572 | nan | 0.9572 | 0.0 | 0.9572 |
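
The metric names above match the output of the `evaluate` library's `mean_iou` metric; the sketch below shows how such numbers can be computed from predicted and reference masks. The two-label setup and `ignore_index=255` are assumptions, not confirmed by this card:

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy masks standing in for predicted and ground-truth segmentation maps
# (0 = background, 1 = biofilm). Replace with real (H, W) arrays.
predictions = [np.array([[1, 1], [0, 1]])]
references = [np.array([[1, 0], [0, 1]])]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=2,        # assumed: background and biofilm
    ignore_index=255,    # assumed ignore label
    reduce_labels=False,
)
print(results["mean_iou"], results["per_category_iou"])
```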

Framework versions

  • Transformers 4.38.0.dev0
  • Pytorch 2.1.0
  • Datasets 2.14.4
  • Tokenizers 0.15.1