
beit-base-patch16-224-pt22k-ft22k-finetuned-plantorgans

This model is a fine-tuned version of microsoft/beit-base-patch16-224-pt22k-ft22k on the jpodivin/plantorgans dataset. It achieves the following results on the evaluation set:

  • Loss: 1.2004
  • Mean Iou: 0.0260
  • Mean Accuracy: 0.0325
  • Overall Accuracy: 0.0435
  • Accuracy Void: nan
  • Accuracy Fruit: 0.1299
  • Accuracy Leaf: 0.0
  • Accuracy Flower: 0.0
  • Accuracy Stem: 0.0001
  • Iou Void: 0.0
  • Iou Fruit: 0.1299
  • Iou Leaf: 0.0
  • Iou Flower: 0.0
  • Iou Stem: 0.0001
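The per-class numbers above are standard confusion-matrix segmentation metrics; the `nan` for the `void` class indicates its pixels are excluded from scoring. A minimal sketch of how such metrics can be computed (a hypothetical helper for illustration, not the evaluation code used for this card):

```python
import numpy as np

def segmentation_metrics(pred, target, num_classes, ignore_index=0):
    """Per-class pixel accuracy and IoU from integer label maps.

    `ignore_index` plays the role of the 'void' class: its pixels are
    dropped before scoring (hence the nan accuracy reported above).
    """
    pred, target = np.ravel(pred), np.ravel(target)
    keep = target != ignore_index
    pred, target = pred[keep], target[keep]
    # confusion[i, j] = number of pixels with true class i predicted as class j
    confusion = np.bincount(
        num_classes * target + pred, minlength=num_classes ** 2
    ).reshape(num_classes, num_classes)
    tp = np.diag(confusion).astype(float)
    with np.errstate(divide="ignore", invalid="ignore"):
        acc = tp / confusion.sum(axis=1)  # per-class accuracy (recall)
        iou = tp / (confusion.sum(axis=1) + confusion.sum(axis=0) - tp)
    return acc, iou
```

Mean Accuracy and Mean IoU are then `np.nanmean` over the per-class vectors, and Overall Accuracy is total correct pixels over total scored pixels.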

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 10
  • eval_batch_size: 10
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 10.0
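With 5750 total optimization steps and 100 warmup steps, a linear scheduler ramps the learning rate from 0 to 5e-05 over the first 100 steps, then decays it linearly to 0 at the final step. A sketch of that schedule (mirroring the behaviour of `transformers.get_linear_schedule_with_warmup`; the exact scheduler implementation is an assumption):

```python
def linear_lr(step, base_lr=5e-05, warmup_steps=100, total_steps=5750):
    """Learning rate at `step` for linear warmup followed by linear decay."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps  # ramp up from 0
    # decay linearly, reaching 0 at total_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```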

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Void | Accuracy Fruit | Accuracy Leaf | Accuracy Flower | Accuracy Stem | Iou Void | Iou Fruit | Iou Leaf | Iou Flower | Iou Stem |
|--------------|-------|------|-----------------|----------|---------------|------------------|---------------|----------------|---------------|-----------------|---------------|----------|-----------|----------|------------|----------|
| 1.0657 | 1.0 | 575 | 1.1178 | 0.0158 | 0.0197 | 0.0261 | nan | 0.0776 | 0.0 | 0.0 | 0.0012 | 0.0 | 0.0776 | 0.0 | 0.0 | 0.0012 |
| 0.5398 | 2.0 | 1150 | 1.0816 | 0.0361 | 0.0451 | 0.0599 | nan | 0.1783 | 0.0000 | 0.0 | 0.0020 | 0.0 | 0.1783 | 0.0000 | 0.0 | 0.0020 |
| 0.4794 | 3.0 | 1725 | 1.2122 | 0.0110 | 0.0138 | 0.0179 | nan | 0.0517 | 0.0003 | 0.0 | 0.0032 | 0.0 | 0.0517 | 0.0003 | 0.0 | 0.0032 |
| 0.446 | 4.0 | 2300 | 1.3138 | 0.0114 | 0.0142 | 0.0189 | nan | 0.0562 | 0.0 | 0.0 | 0.0006 | 0.0 | 0.0562 | 0.0 | 0.0 | 0.0006 |
| 0.4422 | 5.0 | 2875 | 1.2360 | 0.0005 | 0.0006 | 0.0006 | nan | 0.0012 | 0.0 | 0.0 | 0.0013 | 0.0 | 0.0012 | 0.0 | 0.0 | 0.0013 |
| 0.4183 | 6.0 | 3450 | 1.3598 | 0.0177 | 0.0221 | 0.0296 | nan | 0.0885 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0885 | 0.0 | 0.0 | 0.0 |
| 0.3921 | 7.0 | 4025 | 1.2523 | 0.0267 | 0.0333 | 0.0446 | nan | 0.1333 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.1333 | 0.0000 | 0.0 | 0.0 |
| 0.3743 | 8.0 | 4600 | 1.3146 | 0.0401 | 0.0502 | 0.0670 | nan | 0.2002 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.2001 | 0.0 | 0.0 | 0.0005 |
| 0.3695 | 9.0 | 5175 | 1.2873 | 0.0294 | 0.0368 | 0.0492 | nan | 0.1472 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.1471 | 0.0 | 0.0 | 0.0001 |
| 0.3796 | 10.0 | 5750 | 1.2004 | 0.0260 | 0.0325 | 0.0435 | nan | 0.1299 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.1299 | 0.0 | 0.0 | 0.0001 |

Framework versions

  • Transformers 4.38.0.dev0
  • Pytorch 2.1.2+cu121
  • Datasets 2.16.1
  • Tokenizers 0.15.0