segformer-b0-finetuned-segments-stamp-verification

This model is a fine-tuned version of nvidia/mit-b0 on the AliShah07/stamp-verification dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0535
  • Mean Iou: 0.1317
  • Mean Accuracy: 0.2635
  • Overall Accuracy: 0.2635
  • Accuracy Unlabeled: nan
  • Accuracy Stamp: 0.2635
  • Iou Unlabeled: 0.0
  • Iou Stamp: 0.2635
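
The snippet below is a minimal, illustrative inference sketch for this checkpoint; the repository id, the input image path, and the label ordering (0 = unlabeled, 1 = stamp) are assumptions inferred from the card's title and metric names rather than documented here.

```python
# Minimal inference sketch for this SegFormer checkpoint (assumptions noted inline).
from PIL import Image
import torch
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo_id = "AliShah07/segformer-b0-finetuned-segments-stamp-verification"  # assumed repo id
processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)

image = Image.open("document_with_stamp.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax to get the stamp mask.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # label ids assumed: 0 = unlabeled, 1 = stamp
```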

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
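
As an illustrative sketch only, the values above map onto a Transformers TrainingArguments configuration like the following; the output directory and any settings not listed above are assumptions, not taken from the card.

```python
# Hedged sketch of a TrainingArguments setup matching the listed hyperparameters.
# Only the values shown in the card are grounded; everything else is illustrative.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-stamp-verification",  # assumed
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=20,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```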

Training results

| Training Loss | Epoch   | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Stamp | Iou Unlabeled | Iou Stamp |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:--------------:|:-------------:|:---------:|
| 0.6502        | 0.8333  | 20   | 0.6958          | 0.4685   | 0.9370        | 0.9370           | nan                | 0.9370         | 0.0           | 0.9370    |
| 0.4529        | 1.6667  | 40   | 0.5458          | 0.0754   | 0.1508        | 0.1508           | nan                | 0.1508         | 0.0           | 0.1508    |
| 0.3716        | 2.5     | 60   | 0.3818          | 0.0021   | 0.0041        | 0.0041           | nan                | 0.0041         | 0.0           | 0.0041    |
| 0.3238        | 3.3333  | 80   | 0.2932          | 0.0126   | 0.0252        | 0.0252           | nan                | 0.0252         | 0.0           | 0.0252    |
| 0.2167        | 4.1667  | 100  | 0.2326          | 0.0008   | 0.0015        | 0.0015           | nan                | 0.0015         | 0.0           | 0.0015    |
| 0.1948        | 5.0     | 120  | 0.2029          | 0.0033   | 0.0065        | 0.0065           | nan                | 0.0065         | 0.0           | 0.0065    |
| 0.1643        | 5.8333  | 140  | 0.1609          | 0.0      | 0.0           | 0.0              | nan                | 0.0            | 0.0           | 0.0       |
| 0.1642        | 6.6667  | 160  | 0.1428          | 0.0      | 0.0           | 0.0              | nan                | 0.0            | 0.0           | 0.0       |
| 0.1326        | 7.5     | 180  | 0.1222          | 0.0001   | 0.0002        | 0.0002           | nan                | 0.0002         | 0.0           | 0.0002    |
| 0.1012        | 8.3333  | 200  | 0.0981          | 0.0      | 0.0           | 0.0              | nan                | 0.0            | 0.0           | 0.0       |
| 0.0981        | 9.1667  | 220  | 0.0972          | 0.0058   | 0.0117        | 0.0117           | nan                | 0.0117         | 0.0           | 0.0117    |
| 0.0838        | 10.0    | 240  | 0.0781          | 0.0015   | 0.0031        | 0.0031           | nan                | 0.0031         | 0.0           | 0.0031    |
| 0.0771        | 10.8333 | 260  | 0.0708          | 0.0060   | 0.0120        | 0.0120           | nan                | 0.0120         | 0.0           | 0.0120    |
| 0.0743        | 11.6667 | 280  | 0.0696          | 0.0298   | 0.0596        | 0.0596           | nan                | 0.0596         | 0.0           | 0.0596    |
| 0.0655        | 12.5    | 300  | 0.0630          | 0.0398   | 0.0795        | 0.0795           | nan                | 0.0795         | 0.0           | 0.0795    |
| 0.0673        | 13.3333 | 320  | 0.0613          | 0.0856   | 0.1712        | 0.1712           | nan                | 0.1712         | 0.0           | 0.1712    |
| 0.0573        | 14.1667 | 340  | 0.0538          | 0.0725   | 0.1450        | 0.1450           | nan                | 0.1450         | 0.0           | 0.1450    |
| 0.0623        | 15.0    | 360  | 0.0543          | 0.1008   | 0.2016        | 0.2016           | nan                | 0.2016         | 0.0           | 0.2016    |
| 0.0557        | 15.8333 | 380  | 0.0559          | 0.1474   | 0.2947        | 0.2947           | nan                | 0.2947         | 0.0           | 0.2947    |
| 0.0594        | 16.6667 | 400  | 0.0492          | 0.1019   | 0.2039        | 0.2039           | nan                | 0.2039         | 0.0           | 0.2039    |
| 0.056         | 17.5    | 420  | 0.0479          | 0.1235   | 0.2470        | 0.2470           | nan                | 0.2470         | 0.0           | 0.2470    |
| 0.0499        | 18.3333 | 440  | 0.0481          | 0.1124   | 0.2248        | 0.2248           | nan                | 0.2248         | 0.0           | 0.2248    |
| 0.0516        | 19.1667 | 460  | 0.0477          | 0.1465   | 0.2930        | 0.2930           | nan                | 0.2930         | 0.0           | 0.2930    |
| 0.0517        | 20.0    | 480  | 0.0535          | 0.1317   | 0.2635        | 0.2635           | nan                | 0.2635         | 0.0           | 0.2635    |

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.2.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1
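
As a small convenience (not part of the original card), the installed libraries can be checked against these versions with a sketch like the following:

```python
# Compare the runtime environment against the versions listed above (illustrative check).
import transformers, torch, datasets, tokenizers

expected = {
    "transformers": "4.40.2",
    "torch": "2.2.1+cu121",
    "datasets": "2.19.1",
    "tokenizers": "0.19.1",
}
actual = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, version in expected.items():
    print(f"{name}: expected {version}, found {actual[name]}")
```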