layoutlm-funsd

This model is a fine-tuned version of microsoft/layoutlm-base-uncased on the FUNSD dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1459
  • Answer: precision 0.3921, recall 0.5501, F1 0.4578 (support: 809)
  • Header: precision 0.3636, recall 0.2689, F1 0.3092 (support: 119)
  • Question: precision 0.5137, recall 0.5991, F1 0.5531 (support: 1065)
  • Overall Precision: 0.4523
  • Overall Recall: 0.5595
  • Overall F1: 0.5002
  • Overall Accuracy: 0.6006
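
The checkpoint can be loaded for token classification with the `transformers` library. Below is a minimal inference sketch; the repo id `your-username/layoutlm-funsd` is a placeholder for wherever this checkpoint is hosted, and the words and bounding boxes are illustrative (LayoutLM expects word boxes normalized to a 0-1000 scale):

```python
import torch
from transformers import LayoutLMForTokenClassification, LayoutLMTokenizerFast

tokenizer = LayoutLMTokenizerFast.from_pretrained("microsoft/layoutlm-base-uncased")
# Placeholder repo id; substitute the actual location of this checkpoint.
model = LayoutLMForTokenClassification.from_pretrained("your-username/layoutlm-funsd")

# Illustrative OCR output: one 0-1000-normalized box per word.
words = ["Date:", "01/01/2024"]
boxes = [[70, 60, 140, 80], [150, 60, 260, 80]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Expand word-level boxes to token level; special tokens get [0, 0, 0, 0].
token_boxes = [
    [0, 0, 0, 0] if idx is None else boxes[idx] for idx in encoding.word_ids()
]
encoding["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**encoding).logits

# Map predicted class ids to label names via the checkpoint's own config.
predicted = [model.config.id2label[i] for i in logits.argmax(-1).squeeze().tolist()]
print(predicted)
```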

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 15
  • mixed_precision_training: Native AMP
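
For reference, here is a sketch of a `TrainingArguments` configuration matching the values above; `output_dir` is a placeholder, and the Adam betas and epsilon listed above are the library defaults, so they need no explicit arguments:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlm-funsd",        # placeholder, not from the original run
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                          # Native AMP mixed-precision training
)
```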

Training results

Entity columns report precision / recall / F1, rounded to four decimal places (support: Answer 809, Header 119, Question 1065).

| Training Loss | Epoch | Step | Validation Loss | Answer (P / R / F1) | Header (P / R / F1) | Question (P / R / F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|
| 1.7425 | 1.0 | 10 | 1.4798 | 0.0544 / 0.0828 / 0.0657 | 0.0 / 0.0 / 0.0 | 0.2244 / 0.2432 / 0.2334 | 0.1366 | 0.1636 | 0.1489 | 0.3756 |
| 1.4190 | 2.0 | 20 | 1.3167 | 0.2112 / 0.4957 / 0.2962 | 0.0889 / 0.0336 / 0.0488 | 0.2355 / 0.3005 / 0.2640 | 0.2195 | 0.3638 | 0.2738 | 0.4192 |
| 1.2741 | 3.0 | 30 | 1.2387 | 0.2594 / 0.5105 / 0.3440 | 0.2703 / 0.1681 / 0.2073 | 0.3472 / 0.4789 / 0.4025 | 0.3008 | 0.4732 | 0.3678 | 0.4611 |
| 1.1470 | 4.0 | 40 | 1.1190 | 0.2633 / 0.5142 / 0.3483 | 0.2800 / 0.1765 / 0.2165 | 0.4030 / 0.5014 / 0.4469 | 0.3258 | 0.4872 | 0.3905 | 0.5426 |
| 1.0331 | 5.0 | 50 | 1.1534 | 0.2893 / 0.5068 / 0.3684 | 0.2877 / 0.1765 / 0.2188 | 0.4216 / 0.5906 / 0.4920 | 0.3555 | 0.5319 | 0.4261 | 0.5476 |
| 0.9715 | 6.0 | 60 | 1.1035 | 0.3210 / 0.5587 / 0.4078 | 0.3158 / 0.1513 / 0.2045 | 0.4637 / 0.5155 / 0.4882 | 0.3847 | 0.5113 | 0.4390 | 0.5706 |
| 0.8925 | 7.0 | 70 | 1.0616 | 0.3607 / 0.5155 / 0.4244 | 0.2947 / 0.2353 / 0.2617 | 0.4845 / 0.5737 / 0.5254 | 0.4204 | 0.5299 | 0.4688 | 0.5874 |
| 0.8174 | 8.0 | 80 | 1.0694 | 0.3474 / 0.5105 / 0.4134 | 0.3373 / 0.2353 / 0.2772 | 0.4794 / 0.5803 / 0.5251 | 0.4135 | 0.5314 | 0.4651 | 0.5893 |
| 0.7698 | 9.0 | 90 | 1.1272 | 0.3564 / 0.5600 / 0.4356 | 0.3494 / 0.2437 / 0.2871 | 0.5009 / 0.5333 / 0.5166 | 0.4220 | 0.5268 | 0.4686 | 0.5817 |
| 0.7676 | 10.0 | 100 | 1.1380 | 0.3715 / 0.5130 / 0.4309 | 0.2952 / 0.2605 / 0.2768 | 0.5185 / 0.5784 / 0.5468 | 0.4407 | 0.5329 | 0.4824 | 0.5958 |
| 0.6932 | 11.0 | 110 | 1.1051 | 0.3870 / 0.4784 / 0.4279 | 0.3704 / 0.2521 / 0.3000 | 0.4865 / 0.6263 / 0.5476 | 0.4421 | 0.5439 | 0.4877 | 0.6026 |
| 0.6856 | 12.0 | 120 | 1.1257 | 0.3883 / 0.5266 / 0.4470 | 0.3409 / 0.2521 / 0.2899 | 0.4867 / 0.6207 / 0.5456 | 0.4392 | 0.5605 | 0.4925 | 0.6021 |
| 0.6592 | 13.0 | 130 | 1.1253 | 0.3946 / 0.5439 / 0.4574 | 0.3614 / 0.2521 / 0.2970 | 0.5112 / 0.5991 / 0.5517 | 0.4530 | 0.5559 | 0.4992 | 0.6066 |
| 0.6358 | 14.0 | 140 | 1.1420 | 0.3907 / 0.5389 / 0.4530 | 0.3690 / 0.2605 / 0.3054 | 0.5063 / 0.6075 / 0.5523 | 0.4496 | 0.5590 | 0.4983 | 0.6018 |
| 0.6263 | 15.0 | 150 | 1.1459 | 0.3921 / 0.5501 / 0.4578 | 0.3636 / 0.2689 / 0.3092 | 0.5137 / 0.5991 / 0.5531 | 0.4523 | 0.5595 | 0.5002 | 0.6006 |
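
The per-entity metrics above follow the output format of the `seqeval` metric, loaded here via the `evaluate` library (which requires the `seqeval` package to be installed). A minimal sketch with illustrative label sequences, not data from the evaluation set:

```python
import evaluate

seqeval = evaluate.load("seqeval")

# Illustrative BIO-tagged predictions and references.
predictions = [["B-QUESTION", "O", "B-ANSWER", "I-ANSWER"]]
references = [["B-QUESTION", "O", "B-ANSWER", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
# `results` holds one dict per entity type, e.g.
# {'precision': ..., 'recall': ..., 'f1': ..., 'number': ...},
# plus overall_precision / overall_recall / overall_f1 / overall_accuracy.
print(results)
```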

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.2.2+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2