---
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
  - generated_from_trainer
datasets:
  - funsd
model-index:
  - name: layout-lm
    results: []
---

# layout-lm

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD dataset. It achieves the following results on the evaluation set:

- Loss: 0.6748
- Answer: precision 0.7246, recall 0.8096, F1 0.7647 (support: 809)
- Header: precision 0.3465, recall 0.3697, F1 0.3577 (support: 119)
- Question: precision 0.7756, recall 0.8244, F1 0.7993 (support: 1065)
- Overall precision: 0.7291
- Overall recall: 0.7913
- Overall F1: 0.7589
- Overall accuracy: 0.8136

## Model description

LayoutLM jointly embeds token text and 2-D layout information (word bounding boxes) to model scanned documents. This checkpoint fine-tunes the base model for token classification on forms, tagging each token as part of a question, answer, or header field. A minimal usage sketch follows.
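The sketch below shows inference with a fine-tuned LayoutLM token-classification checkpoint. The checkpoint path and the example words/boxes are placeholders, and the zero box for special tokens is a common convention, not something this card specifies.

```python
import torch
from transformers import AutoTokenizer, LayoutLMForTokenClassification

# Placeholder: point this at the published repo id or a local checkpoint dir.
model_id = "path/to/layout-lm"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = LayoutLMForTokenClassification.from_pretrained(model_id)

# Example words with one normalized (0-1000) bounding box per word.
words = ["Date:", "03/14/2022"]
boxes = [[70, 60, 140, 80], [150, 60, 300, 80]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Expand word-level boxes to token level; special tokens get a dummy box.
token_boxes = [
    [0, 0, 0, 0] if word_idx is None else boxes[word_idx]
    for word_idx in encoding.word_ids()
]
encoding["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**encoding).logits
predictions = logits.argmax(-1).squeeze().tolist()
print([model.config.id2label[p] for p in predictions])
```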

## Intended uses & limitations

The model is intended for key information extraction from scanned forms similar to those in FUNSD. Note that header fields are detected far less reliably (F1 ≈ 0.36) than questions and answers (F1 ≈ 0.76–0.80).

## Training and evaluation data

The model was fine-tuned and evaluated on FUNSD (Form Understanding in Noisy Scanned Documents), a dataset of 199 annotated scanned forms with word-level bounding boxes and entity labels (question, answer, header, other).
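The data can be loaded with the datasets library. The Hub id below is an assumption based on the `funsd` tag in the metadata (nielsr/funsd is a commonly used copy), as are the exact field names.

```python
from datasets import load_dataset

# Assumed Hub id; the card's metadata only says `funsd`.
dataset = load_dataset("nielsr/funsd")

example = dataset["train"][0]
print(example["words"][:5])     # word texts
print(example["bboxes"][:5])    # one normalized (0-1000) box per word
print(example["ner_tags"][:5])  # BIO label ids over HEADER/QUESTION/ANSWER
```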

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training; a matching `TrainingArguments` sketch follows the list:

- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
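These settings map onto a `TrainingArguments` configuration roughly as follows. This is a sketch: `output_dir` is a placeholder, and the Adam betas and epsilon above are the library defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layout-lm",          # placeholder output path
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                       # Native AMP mixed precision
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are defaults.
)
```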

### Training results

Per-entity columns report precision / recall / F1; support is constant across epochs (Answer: 809, Header: 119, Question: 1065).

| Training Loss | Epoch | Step | Validation Loss | Answer (P / R / F1) | Header (P / R / F1) | Question (P / R / F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.7656 | 1.0 | 10 | 1.5482 | 0.0304 / 0.0260 / 0.0280 | 0.0000 / 0.0000 / 0.0000 | 0.3199 / 0.2009 / 0.2468 | 0.1728 | 0.1179 | 0.1402 | 0.3707 |
| 1.4041 | 2.0 | 20 | 1.1664 | 0.1525 / 0.1372 / 0.1444 | 0.0000 / 0.0000 / 0.0000 | 0.4679 / 0.5681 / 0.5131 | 0.3539 | 0.3593 | 0.3566 | 0.6164 |
| 1.0549 | 3.0 | 30 | 0.8895 | 0.5210 / 0.4438 / 0.4793 | 0.2500 / 0.0840 / 0.1258 | 0.5932 / 0.7080 / 0.6455 | 0.5615 | 0.5635 | 0.5625 | 0.7226 |
| 0.8144 | 4.0 | 40 | 0.7445 | 0.6220 / 0.6996 / 0.6585 | 0.2754 / 0.1597 / 0.2021 | 0.6641 / 0.7427 / 0.7012 | 0.6341 | 0.6904 | 0.6611 | 0.7620 |
| 0.6601 | 5.0 | 50 | 0.6786 | 0.6609 / 0.7491 / 0.7022 | 0.3462 / 0.2269 / 0.2741 | 0.6854 / 0.8019 / 0.7391 | 0.6635 | 0.7461 | 0.7024 | 0.7912 |
| 0.5580 | 6.0 | 60 | 0.6751 | 0.6495 / 0.7812 / 0.7093 | 0.3462 / 0.2269 / 0.2741 | 0.7348 / 0.7831 / 0.7582 | 0.6830 | 0.7491 | 0.7145 | 0.7873 |
| 0.4876 | 7.0 | 70 | 0.6439 | 0.6867 / 0.7750 / 0.7282 | 0.2672 / 0.2605 / 0.2638 | 0.7351 / 0.8131 / 0.7722 | 0.6905 | 0.7647 | 0.7257 | 0.8059 |
| 0.4310 | 8.0 | 80 | 0.6333 | 0.7020 / 0.7948 / 0.7455 | 0.3158 / 0.3025 / 0.3090 | 0.7441 / 0.8272 / 0.7835 | 0.7046 | 0.7827 | 0.7416 | 0.8119 |
| 0.3849 | 9.0 | 90 | 0.6338 | 0.7135 / 0.7973 / 0.7531 | 0.3465 / 0.2941 / 0.3182 | 0.7697 / 0.8254 / 0.7966 | 0.7261 | 0.7822 | 0.7531 | 0.8189 |
| 0.3741 | 10.0 | 100 | 0.6533 | 0.7054 / 0.8171 / 0.7572 | 0.3109 / 0.3109 / 0.3109 | 0.7736 / 0.8150 / 0.7938 | 0.7190 | 0.7858 | 0.7509 | 0.8133 |
| 0.3184 | 11.0 | 110 | 0.6556 | 0.7066 / 0.8096 / 0.7546 | 0.3203 / 0.3445 / 0.3320 | 0.7631 / 0.8347 / 0.7973 | 0.7140 | 0.7953 | 0.7524 | 0.8104 |
| 0.3038 | 12.0 | 120 | 0.6681 | 0.7227 / 0.8022 / 0.7604 | 0.3305 / 0.3277 / 0.3291 | 0.7851 / 0.8235 / 0.8038 | 0.7337 | 0.7852 | 0.7586 | 0.8155 |
| 0.2922 | 13.0 | 130 | 0.6667 | 0.7234 / 0.8146 / 0.7663 | 0.3604 / 0.3361 / 0.3478 | 0.7811 / 0.8441 / 0.8114 | 0.7354 | 0.8018 | 0.7672 | 0.8150 |
| 0.2685 | 14.0 | 140 | 0.6738 | 0.7297 / 0.8109 / 0.7681 | 0.3385 / 0.3697 / 0.3534 | 0.7789 / 0.8300 / 0.8036 | 0.7320 | 0.7948 | 0.7621 | 0.8131 |
| 0.2668 | 15.0 | 150 | 0.6748 | 0.7246 / 0.8096 / 0.7647 | 0.3465 / 0.3697 / 0.3577 | 0.7756 / 0.8244 / 0.7993 | 0.7291 | 0.7913 | 0.7589 | 0.8136 |
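Entity-level precision/recall/F1 figures like those above are conventionally computed with seqeval over BIO tag sequences; this is an assumption, since the card does not name its metric implementation. A toy example:

```python
from seqeval.metrics import classification_report

# Toy gold and predicted BIO sequences for a single document.
y_true = [["B-QUESTION", "I-QUESTION", "B-ANSWER", "O", "B-HEADER"]]
y_pred = [["B-QUESTION", "I-QUESTION", "B-ANSWER", "O", "O"]]
print(classification_report(y_true, y_pred))
```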

### Framework versions

- Transformers 4.43.3
- PyTorch 2.4.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1