---
license: apache-2.0
base_model: kacper-cierzniewski/daigram_detr_r50_albumentations
tags:
  - generated_from_trainer
datasets:
  - bpmn-shapes
model-index:
  - name: daigram_detr_r50_albumentations_finetuning
    results: []
---

# daigram_detr_r50_albumentations_finetuning

This model is a fine-tuned version of kacper-cierzniewski/daigram_detr_r50_albumentations on the bpmn-shapes dataset. It achieves the following results on the evaluation set:

- Loss: 0.9817
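A minimal inference sketch, assuming the checkpoint ships the standard DETR image processor and object-detection head; the repo id, image path, and score threshold below are illustrative placeholders, not taken from this card:

```python
# Sketch: load the fine-tuned checkpoint with the standard DETR
# object-detection classes and run one image through it.
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Assumed repo id; adjust to wherever the checkpoint actually lives.
checkpoint = "kacper-cierzniewski/daigram_detr_r50_albumentations_finetuning"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("diagram.png").convert("RGB")  # placeholder path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (label, score, box) detections above a threshold.
target_sizes = torch.tensor([image.size[::-1]])  # PIL size is (w, h); DETR wants (h, w)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```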

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch mapping them onto `TrainingArguments` follows the list):

- learning_rate: 1e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 500
- mixed_precision_training: Native AMP
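For reproduction, these values map onto `transformers.TrainingArguments` roughly as sketched below. Only the keyword values listed above come from this card; the `output_dir` and the step-based evaluation cadence (every 50 steps, inferred from the results table in the next section) are assumptions.

```python
# Sketch of TrainingArguments matching the reported hyperparameters.
# Model, dataset, and collator setup are out of scope and not from the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="daigram_detr_r50_albumentations_finetuning",  # assumption
    learning_rate=1e-5,
    per_device_train_batch_size=48,
    per_device_eval_batch_size=48,
    seed=42,
    adam_beta1=0.9,            # Adam betas reported above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=500,
    fp16=True,                 # "Native AMP" mixed-precision training
    evaluation_strategy="steps",  # inferred: validation loss logged every 50 steps
    eval_steps=50,
    logging_steps=50,
)
```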

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.9457        | 12.5  | 50   | 1.0238          |
| 0.9717        | 25.0  | 100  | 1.0411          |
| 0.9823        | 37.5  | 150  | 1.0269          |
| 0.9524        | 50.0  | 200  | 1.0518          |
| 0.9886        | 62.5  | 250  | 1.0548          |
| 0.9638        | 75.0  | 300  | 1.0454          |
| 0.948         | 87.5  | 350  | 1.0240          |
| 0.9312        | 100.0 | 400  | 1.0281          |
| 0.9183        | 112.5 | 450  | 1.0112          |
| 0.9219        | 125.0 | 500  | 1.0110          |
| 0.9285        | 137.5 | 550  | 1.0325          |
| 0.9177        | 150.0 | 600  | 1.0009          |
| 0.9323        | 162.5 | 650  | 1.0124          |
| 0.9333        | 175.0 | 700  | 1.0154          |
| 0.9386        | 187.5 | 750  | 1.0188          |
| 0.9586        | 200.0 | 800  | 0.9978          |
| 0.894         | 212.5 | 850  | 1.0087          |
| 0.8999        | 225.0 | 900  | 1.0055          |
| 0.9324        | 237.5 | 950  | 1.0185          |
| 0.9313        | 250.0 | 1000 | 0.9840          |
| 0.9177        | 262.5 | 1050 | 0.9785          |
| 0.8918        | 275.0 | 1100 | 0.9874          |
| 0.9145        | 287.5 | 1150 | 0.9802          |
| 0.89          | 300.0 | 1200 | 0.9879          |
| 0.8818        | 312.5 | 1250 | 0.9857          |
| 0.9256        | 325.0 | 1300 | 0.9951          |
| 0.9028        | 337.5 | 1350 | 1.0001          |
| 0.9252        | 350.0 | 1400 | 1.0033          |
| 0.9017        | 362.5 | 1450 | 0.9916          |
| 0.8783        | 375.0 | 1500 | 0.9858          |
| 0.911         | 387.5 | 1550 | 0.9758          |
| 0.8797        | 400.0 | 1600 | 0.9810          |
| 0.8995        | 412.5 | 1650 | 0.9840          |
| 0.8781        | 425.0 | 1700 | 0.9843          |
| 0.8897        | 437.5 | 1750 | 0.9745          |
| 0.905         | 450.0 | 1800 | 0.9825          |
| 0.8961        | 462.5 | 1850 | 0.9781          |
| 0.8865        | 475.0 | 1900 | 0.9781          |
| 0.8824        | 487.5 | 1950 | 0.9794          |
| 0.8836        | 500.0 | 2000 | 0.9817          |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0