
layoutlm-funsd

This model is a fine-tuned version of microsoft/layoutlm-base-uncased on the FUNSD dataset (form understanding in noisy scanned documents). It achieves the following results on the evaluation set:

  • Loss: 1.0534
  • Answer: precision 0.3802, recall 0.5278, F1 0.4420 (support: 809)
  • Header: precision 0.3333, recall 0.2437, F1 0.2816 (support: 119)
  • Question: precision 0.5214, recall 0.6282, F1 0.5698 (support: 1065)
  • Overall Precision: 0.4513
  • Overall Recall: 0.5645
  • Overall F1: 0.5016
  • Overall Accuracy: 0.6341
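The overall precision, recall, and F1 are micro-averages over all predicted entities. As a sanity check (my own sketch, not part of the original training script), the per-class precision, recall, and support above are enough to reconstruct them:

```python
# Reconstruct the micro-averaged overall metrics from the per-class numbers.
# (precision, recall, support) for Answer, Header, Question:
per_class = [
    (0.3802, 0.5278, 809),   # Answer
    (0.3333, 0.2437, 119),   # Header
    (0.5214, 0.6282, 1065),  # Question
]

tp = pred = true = 0
for p, r, n in per_class:
    c_tp = round(r * n)      # true positives: recall * support
    tp += c_tp
    pred += round(c_tp / p)  # predicted entities: TP / precision
    true += n

precision = tp / pred
recall = tp / true
f1 = 2 * precision * recall / (precision + recall)
print(round(precision, 4), round(recall, 4), round(f1, 4))
# → 0.4513 0.5645 0.5016
```

The rounded results match the reported overall figures; accuracy is a token-level metric and cannot be recovered from the entity counts alone.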

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 15
  • mixed_precision_training: Native AMP
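These hyperparameters map directly onto the 🤗 Transformers Trainer API. A minimal sketch of the corresponding TrainingArguments follows; this reconstruction is mine, not the original training script, and the output_dir value is a placeholder:

```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameters listed above.
# "layoutlm-funsd" as output_dir is a placeholder, not taken from the card.
training_args = TrainingArguments(
    output_dir="layoutlm-funsd",
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,  # "Native AMP" mixed precision
)
```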

Training results

| Training Loss | Epoch | Step | Validation Loss | Answer P | Answer R | Answer F1 | Header P | Header R | Header F1 | Question P | Question R | Question F1 | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.7733 | 1.0 | 10 | 1.5779 | 0.0324 | 0.0358 | 0.0341 | 0.0000 | 0.0000 | 0.0000 | 0.2724 | 0.2085 | 0.2362 | 0.1469 | 0.1259 | 0.1356 | 0.3498 |
| 1.4958 | 2.0 | 20 | 1.3947 | 0.1557 | 0.2979 | 0.2045 | 0.0000 | 0.0000 | 0.0000 | 0.2497 | 0.4113 | 0.3107 | 0.2047 | 0.3407 | 0.2557 | 0.4093 |
| 1.3200 | 3.0 | 30 | 1.2259 | 0.2252 | 0.3869 | 0.2847 | 0.0909 | 0.0504 | 0.0649 | 0.3337 | 0.5915 | 0.4267 | 0.2838 | 0.4762 | 0.3556 | 0.4708 |
| 1.1874 | 4.0 | 40 | 1.1299 | 0.2546 | 0.4438 | 0.3236 | 0.3086 | 0.2101 | 0.2500 | 0.3853 | 0.5850 | 0.4646 | 0.3240 | 0.5053 | 0.3948 | 0.5607 |
| 1.0790 | 5.0 | 50 | 1.0967 | 0.2881 | 0.4487 | 0.3509 | 0.3171 | 0.2185 | 0.2587 | 0.4067 | 0.6263 | 0.4932 | 0.3541 | 0.5299 | 0.4245 | 0.5684 |
| 1.0153 | 6.0 | 60 | 1.0661 | 0.3208 | 0.5043 | 0.3921 | 0.3378 | 0.2101 | 0.2591 | 0.5031 | 0.5324 | 0.5173 | 0.4044 | 0.5018 | 0.4478 | 0.5887 |
| 0.9487 | 7.0 | 70 | 1.0371 | 0.3274 | 0.4302 | 0.3718 | 0.2844 | 0.2605 | 0.2719 | 0.4402 | 0.6319 | 0.5189 | 0.3895 | 0.5278 | 0.4482 | 0.5965 |
| 0.8939 | 8.0 | 80 | 1.0279 | 0.3354 | 0.4747 | 0.3930 | 0.4167 | 0.2101 | 0.2793 | 0.4401 | 0.6554 | 0.5266 | 0.3966 | 0.5554 | 0.4628 | 0.6073 |
| 0.8226 | 9.0 | 90 | 1.0434 | 0.3650 | 0.5229 | 0.4299 | 0.2766 | 0.2185 | 0.2441 | 0.5245 | 0.5840 | 0.5526 | 0.4391 | 0.5374 | 0.4833 | 0.6047 |
| 0.8109 | 10.0 | 100 | 1.0504 | 0.3831 | 0.5204 | 0.4413 | 0.3258 | 0.2437 | 0.2788 | 0.5186 | 0.5887 | 0.5515 | 0.4493 | 0.5404 | 0.4907 | 0.6087 |
| 0.7313 | 11.0 | 110 | 1.0353 | 0.3555 | 0.4833 | 0.4096 | 0.3462 | 0.2269 | 0.2741 | 0.4864 | 0.6554 | 0.5584 | 0.4271 | 0.5600 | 0.4846 | 0.6283 |
| 0.7183 | 12.0 | 120 | 1.0649 | 0.3669 | 0.5365 | 0.4357 | 0.3580 | 0.2437 | 0.2900 | 0.5118 | 0.6085 | 0.5560 | 0.4391 | 0.5575 | 0.4913 | 0.6293 |
| 0.6865 | 13.0 | 130 | 1.0692 | 0.3752 | 0.5389 | 0.4424 | 0.3846 | 0.2521 | 0.3046 | 0.5404 | 0.5962 | 0.5670 | 0.4559 | 0.5524 | 0.4995 | 0.6258 |
| 0.6566 | 14.0 | 140 | 1.0435 | 0.3845 | 0.5167 | 0.4409 | 0.3488 | 0.2521 | 0.2927 | 0.5182 | 0.6291 | 0.5683 | 0.4534 | 0.5610 | 0.5015 | 0.6295 |
| 0.6437 | 15.0 | 150 | 1.0534 | 0.3802 | 0.5278 | 0.4420 | 0.3333 | 0.2437 | 0.2816 | 0.5214 | 0.6282 | 0.5698 | 0.4513 | 0.5645 | 0.5016 | 0.6341 |

Per-class columns give precision (P), recall (R), and F1, rounded to four decimal places; supports are constant across epochs (Answer: 809, Header: 119, Question: 1065).
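As a quick consistency check (my own sketch, not from the training logs), each row's Overall F1 is the harmonic mean of that row's Overall Precision and Recall:

```python
# (overall precision, overall recall, overall F1) from three table rows.
rows = [
    (0.1469, 0.1259, 0.1356),  # epoch 1
    (0.4493, 0.5404, 0.4907),  # epoch 10
    (0.4513, 0.5645, 0.5016),  # epoch 15
]
for p, r, f1 in rows:
    # Harmonic mean of precision and recall matches the reported F1.
    assert round(2 * p * r / (p + r), 4) == f1
```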

Framework versions

  • Transformers 4.38.2
  • Pytorch 2.2.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2