---
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
- generated_from_trainer
model-index:
- name: layoutlm-custom_no_text
  results: []
---

# layoutlm-custom_no_text

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how metrics in this format are typically computed follows the list):

- Loss: 0.3594
- Noise: precision 0.5977, recall 0.5701, F1 0.5835 (628 entities)
- Signal: precision 0.5559, recall 0.5303, F1 0.5428 (628 entities)
- Overall Precision: 0.5768
- Overall Recall: 0.5502
- Overall F1: 0.5632
- Overall Accuracy: 0.8777
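
The per-label entries above match the output format of the seqeval metric. Below is a hedged sketch of how such metrics are typically computed with the `evaluate` library; the `B-NOISE`/`B-SIGNAL` tag names are assumptions inferred from the labels above, and the card does not include the actual evaluation code.

```python
# Hedged sketch: entity-level metrics via the seqeval backend of `evaluate`.
# Tag names are assumed from the "Noise"/"Signal" labels reported above.
import evaluate

seqeval = evaluate.load("seqeval")
predictions = [["B-NOISE", "I-NOISE", "O", "B-SIGNAL", "O"]]
references = [["B-NOISE", "O", "O", "B-SIGNAL", "O"]]
results = seqeval.compute(predictions=predictions, references=references)
# `results` holds one dict per label plus overall metrics, e.g.
# {'NOISE':  {'precision': ..., 'recall': ..., 'f1': ..., 'number': ...},
#  'SIGNAL': {'precision': ..., 'recall': ..., 'f1': ..., 'number': ...},
#  'overall_precision': ..., 'overall_recall': ...,
#  'overall_f1': ..., 'overall_accuracy': ...}
print(results)
```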

## Model description

More information needed

## Intended uses & limitations

More information needed
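
As a hedged illustration only (the card does not document usage), the checkpoint can presumably be loaded for token classification like any LayoutLM model. The repo id below is inferred from the card name and may need adjusting; the words and boxes are toy values standing in for OCR output, with boxes normalized to the 0-1000 coordinate space LayoutLM expects.

```python
import torch
from transformers import AutoTokenizer, LayoutLMForTokenClassification

model_id = "uttam333/layoutlm-custom_no_text"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained("microsoft/layoutlm-base-uncased")
model = LayoutLMForTokenClassification.from_pretrained(model_id)
model.eval()

words = ["Invoice", "Total:", "42.00"]                       # toy OCR words
boxes = [[50, 60, 180, 85], [50, 100, 140, 125], [150, 100, 230, 125]]

# LayoutLM takes one box per wordpiece, so repeat each word's box
# across all of its subword tokens.
tokens, token_boxes = [], []
for word, box in zip(words, boxes):
    pieces = tokenizer.tokenize(word)
    tokens.extend(pieces)
    token_boxes.extend([box] * len(pieces))

# Add special tokens with their conventional boxes.
all_tokens = [tokenizer.cls_token] + tokens + [tokenizer.sep_token]
input_ids = tokenizer.convert_tokens_to_ids(all_tokens)
token_boxes = [[0, 0, 0, 0]] + token_boxes + [[1000, 1000, 1000, 1000]]

with torch.no_grad():
    outputs = model(
        input_ids=torch.tensor([input_ids]),
        bbox=torch.tensor([token_boxes]),
        attention_mask=torch.ones(1, len(input_ids), dtype=torch.long),
    )
pred_ids = outputs.logits.argmax(-1).squeeze(0).tolist()
print([(tok, model.config.id2label[i]) for tok, i in zip(all_tokens, pred_ids)])
```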

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch reproducing them with `TrainingArguments` follows the list):

- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
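
A minimal sketch reconstructing these settings with the Transformers `Trainer` API; the actual training script is not part of this card, and `output_dir` and `evaluation_strategy` are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlm-custom_no_text",  # assumed output directory
    learning_rate=3e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                    # "Native AMP" mixed precision
    evaluation_strategy="epoch",  # assumed; the card reports metrics once per epoch
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults
    # (adam_beta1, adam_beta2, adam_epsilon), so no override is needed.
)
```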

### Training results

| Training Loss | Epoch | Step | Validation Loss | Noise P | Noise R | Noise F1 | Signal P | Signal R | Signal F1 | Overall P | Overall R | Overall F1 | Overall Acc. |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.5172 | 1.0  | 18  | 0.4915 | 0.3973 | 0.3790 | 0.3879 | 0.3689 | 0.3519 | 0.3602 | 0.3831 | 0.3654 | 0.3741 | 0.7779 |
| 0.4057 | 2.0  | 36  | 0.4306 | 0.4279 | 0.4252 | 0.4265 | 0.3766 | 0.3742 | 0.3754 | 0.4022 | 0.3997 | 0.4010 | 0.8151 |
| 0.3616 | 3.0  | 54  | 0.4145 | 0.4446 | 0.4156 | 0.4296 | 0.4106 | 0.3838 | 0.3967 | 0.4276 | 0.3997 | 0.4132 | 0.8237 |
| 0.3278 | 4.0  | 72  | 0.3994 | 0.5050 | 0.4809 | 0.4927 | 0.4699 | 0.4475 | 0.4584 | 0.4875 | 0.4642 | 0.4755 | 0.8366 |
| 0.2966 | 5.0  | 90  | 0.3795 | 0.5130 | 0.4729 | 0.4921 | 0.5043 | 0.4650 | 0.4838 | 0.5086 | 0.4689 | 0.4880 | 0.8489 |
| 0.2717 | 6.0  | 108 | 0.3526 | 0.5459 | 0.5016 | 0.5228 | 0.5113 | 0.4697 | 0.4896 | 0.5286 | 0.4857 | 0.5062 | 0.8581 |
| 0.2441 | 7.0  | 126 | 0.3400 | 0.5338 | 0.4650 | 0.4970 | 0.4973 | 0.4331 | 0.4630 | 0.5155 | 0.4490 | 0.4800 | 0.8598 |
| 0.2240 | 8.0  | 144 | 0.3324 | 0.5639 | 0.5127 | 0.5371 | 0.5289 | 0.4809 | 0.5038 | 0.5464 | 0.4968 | 0.5204 | 0.8673 |
| 0.2044 | 9.0  | 162 | 0.3249 | 0.5833 | 0.5350 | 0.5581 | 0.5347 | 0.4904 | 0.5116 | 0.5590 | 0.5127 | 0.5349 | 0.8726 |
| 0.1914 | 10.0 | 180 | 0.3481 | 0.5598 | 0.5143 | 0.5361 | 0.5113 | 0.4697 | 0.4896 | 0.5355 | 0.4920 | 0.5129 | 0.8645 |
| 0.1823 | 11.0 | 198 | 0.3412 | 0.5964 | 0.5764 | 0.5862 | 0.5667 | 0.5478 | 0.5571 | 0.5815 | 0.5621 | 0.5717 | 0.8810 |
| 0.1672 | 12.0 | 216 | 0.3496 | 0.5791 | 0.5478 | 0.5630 | 0.5421 | 0.5127 | 0.5270 | 0.5606 | 0.5303 | 0.5450 | 0.8735 |
| 0.1627 | 13.0 | 234 | 0.3675 | 0.5954 | 0.5717 | 0.5833 | 0.5622 | 0.5398 | 0.5508 | 0.5788 | 0.5557 | 0.5670 | 0.8779 |
| 0.1592 | 14.0 | 252 | 0.3562 | 0.5967 | 0.5748 | 0.5856 | 0.5504 | 0.5303 | 0.5401 | 0.5736 | 0.5525 | 0.5629 | 0.8757 |
| 0.1553 | 15.0 | 270 | 0.3594 | 0.5977 | 0.5701 | 0.5835 | 0.5559 | 0.5303 | 0.5428 | 0.5768 | 0.5502 | 0.5632 | 0.8777 |

Noise and Signal metrics (P = precision, R = recall) are each computed over 628 entities per epoch.

### Framework versions

- Transformers 4.36.2
- PyTorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0