
layoutlm-funsd-pytorch

This model is a fine-tuned version of microsoft/layoutlm-base-uncased on the FUNSD dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7042
  • Answer: precision 0.7124, recall 0.8022, F1 0.7547 (support: 809)
  • Header: precision 0.3203, recall 0.3445, F1 0.3320 (support: 119)
  • Question: precision 0.7748, recall 0.8300, F1 0.8015 (support: 1065)
  • Overall Precision: 0.7220
  • Overall Recall: 0.7898
  • Overall F1: 0.7544
  • Overall Accuracy: 0.8078
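
For orientation, below is a minimal inference sketch using the transformers library. The checkpoint id, the OCR words, and the bounding boxes are illustrative assumptions, and the bbox expansion follows the standard LayoutLM recipe rather than anything stated in this card.

```python
import torch
from transformers import LayoutLMTokenizer, LayoutLMForTokenClassification

# Assumption: the Hub id of this checkpoint; adjust to the actual repo path.
model_name = "layoutlm-funsd-pytorch"
tokenizer = LayoutLMTokenizer.from_pretrained("microsoft/layoutlm-base-uncased")
model = LayoutLMForTokenClassification.from_pretrained(model_name)

# Illustrative OCR output: words plus bounding boxes normalized to a 0-1000 grid.
words = ["Date:", "2022-10-14"]
boxes = [[57, 12, 110, 30], [115, 12, 210, 30]]

encoding = tokenizer(" ".join(words), return_tensors="pt")

# Expand word-level boxes to one box per wordpiece, with dummy boxes
# for the [CLS] and [SEP] special tokens.
token_boxes = [[0, 0, 0, 0]]
for word, box in zip(words, boxes):
    token_boxes.extend([box] * len(tokenizer.tokenize(word)))
token_boxes.append([1000, 1000, 1000, 1000])
encoding["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**encoding).logits

predicted = logits.argmax(-1).squeeze().tolist()
tokens = tokenizer.convert_ids_to_tokens(encoding["input_ids"][0])
print([(t, model.config.id2label[p]) for t, p in zip(tokens, predicted)])
```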

Model description

LayoutLM is a pretrained Transformer for document image understanding that jointly encodes text and its 2-D layout: each token is embedded together with its bounding-box position on the page. This checkpoint fine-tunes the base uncased variant for token classification on form understanding, labeling words as question, answer, or header fields.

Intended uses & limitations

The model is intended for token classification on scanned forms: given OCR'd words and their bounding boxes, it labels each word as part of a question, answer, or header field. FUNSD is a small, English-only dataset, so performance on other languages, domains, or layouts is not guaranteed; note in particular the low F1 (about 0.33) on header fields in the results above.

Training and evaluation data

The model was fine-tuned and evaluated on FUNSD (Form Understanding in Noisy Scanned Documents), a dataset of 199 annotated scanned forms (149 for training, 50 for evaluation) with word-level bounding boxes and entity labels for questions, answers, and headers. A loading sketch follows.
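
The snippet below assumes a community mirror of FUNSD on the Hugging Face Hub; the card itself only names "funsd", so the exact hub id and field names may differ.

```python
from datasets import load_dataset

# Assumption: the nielsr/funsd mirror and its schema (words, bboxes, ner_tags).
funsd = load_dataset("nielsr/funsd")
print(funsd)  # train (149 forms) and test (50 forms) splits
example = funsd["train"][0]
print(example["words"][:5], example["bboxes"][:5], example["ner_tags"][:5])
```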

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 15
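
These settings map onto transformers TrainingArguments roughly as follows; this is a minimal sketch, and the output directory and evaluation strategy are assumptions (the results table below suggests per-epoch evaluation):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlm-funsd-pytorch",  # assumption, not stated in the card
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    evaluation_strategy="epoch",  # assumption: metrics are reported once per epoch
)
```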

Training results

Per-category columns report precision / recall / F1; the support is constant across epochs (Answer: 809, Header: 119, Question: 1065).

| Training Loss | Epoch | Step | Validation Loss | Answer (P / R / F1) | Header (P / R / F1) | Question (P / R / F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.7641 | 1.0 | 10 | 1.5569 | 0.0198 / 0.0210 / 0.0204 | 0.0 / 0.0 / 0.0 | 0.2093 / 0.1521 / 0.1762 | 0.1096 | 0.0898 | 0.0987 | 0.3917 |
| 1.4096 | 2.0 | 20 | 1.1718 | 0.1873 / 0.1384 / 0.1592 | 0.0 / 0.0 / 0.0 | 0.4801 / 0.5991 / 0.5330 | 0.3892 | 0.3763 | 0.3827 | 0.6045 |
| 1.0362 | 3.0 | 30 | 0.9322 | 0.5213 / 0.4697 / 0.4941 | 0.1034 / 0.0252 / 0.0405 | 0.6363 / 0.6883 / 0.6613 | 0.5843 | 0.5600 | 0.5719 | 0.7091 |
| 0.8024 | 4.0 | 40 | 0.7725 | 0.6458 / 0.7009 / 0.6722 | 0.1791 / 0.1008 / 0.1290 | 0.6911 / 0.7521 / 0.7203 | 0.6559 | 0.6924 | 0.6737 | 0.7700 |
| 0.6483 | 5.0 | 50 | 0.7035 | 0.6576 / 0.7454 / 0.6987 | 0.2688 / 0.2101 / 0.2358 | 0.7120 / 0.7962 / 0.7518 | 0.6706 | 0.7406 | 0.7039 | 0.7857 |
| 0.5298 | 6.0 | 60 | 0.6747 | 0.6926 / 0.7824 / 0.7348 | 0.3472 / 0.2101 / 0.2618 | 0.7333 / 0.8366 / 0.7816 | 0.7038 | 0.7772 | 0.7387 | 0.7984 |
| 0.4644 | 7.0 | 70 | 0.6752 | 0.6750 / 0.7985 / 0.7316 | 0.2936 / 0.2689 / 0.2807 | 0.7530 / 0.8300 / 0.7896 | 0.6973 | 0.7837 | 0.7380 | 0.8010 |
| 0.4253 | 8.0 | 80 | 0.6664 | 0.6997 / 0.7775 / 0.7365 | 0.3107 / 0.2689 / 0.2883 | 0.7704 / 0.8225 / 0.7956 | 0.7186 | 0.7712 | 0.7439 | 0.8017 |
| 0.3815 | 9.0 | 90 | 0.6658 | 0.6974 / 0.7862 / 0.7391 | 0.3228 / 0.3445 / 0.3333 | 0.7475 / 0.8394 / 0.7908 | 0.7029 | 0.7883 | 0.7431 | 0.8053 |
| 0.3391 | 10.0 | 100 | 0.6736 | 0.7023 / 0.7960 / 0.7462 | 0.3252 / 0.3361 / 0.3306 | 0.7681 / 0.8366 / 0.8009 | 0.7159 | 0.7903 | 0.7513 | 0.8073 |
| 0.3117 | 11.0 | 110 | 0.6947 | 0.7087 / 0.8059 / 0.7542 | 0.3333 / 0.3445 / 0.3388 | 0.7993 / 0.8188 / 0.8089 | 0.7334 | 0.7852 | 0.7584 | 0.8083 |
| 0.2991 | 12.0 | 120 | 0.6963 | 0.7059 / 0.8010 / 0.7504 | 0.3306 / 0.3445 / 0.3374 | 0.7716 / 0.8376 / 0.8032 | 0.7193 | 0.7933 | 0.7545 | 0.8076 |
| 0.282 | 13.0 | 130 | 0.6991 | 0.7154 / 0.8047 / 0.7574 | 0.336 / 0.3529 / 0.3443 | 0.7898 / 0.8291 / 0.8090 | 0.7320 | 0.7908 | 0.7603 | 0.8102 |
| 0.2722 | 14.0 | 140 | 0.7044 | 0.7123 / 0.8047 / 0.7557 | 0.3228 / 0.3445 / 0.3333 | 0.7811 / 0.8310 / 0.8053 | 0.7254 | 0.7913 | 0.7569 | 0.8081 |
| 0.2634 | 15.0 | 150 | 0.7042 | 0.7124 / 0.8022 / 0.7547 | 0.3203 / 0.3445 / 0.3320 | 0.7748 / 0.8300 / 0.8015 | 0.7220 | 0.7898 | 0.7544 | 0.8078 |
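
The per-category and overall precision/recall/F1 above are entity-level metrics in the style of the seqeval library, which is what the reference LayoutLM token-classification examples use; a toy sketch of how such numbers are computed:

```python
from seqeval.metrics import accuracy_score, classification_report, f1_score

# Toy BIO sequences over the FUNSD label set (QUESTION, ANSWER, HEADER).
y_true = [["B-QUESTION", "I-QUESTION", "B-ANSWER", "O", "B-HEADER"]]
y_pred = [["B-QUESTION", "I-QUESTION", "B-ANSWER", "O", "O"]]

print(classification_report(y_true, y_pred))   # per-category precision/recall/F1/support
print("overall F1:", f1_score(y_true, y_pred))
print("token accuracy:", accuracy_score(y_true, y_pred))
```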

Framework versions

  • Transformers 4.23.1
  • PyTorch 1.12.1
  • Datasets 2.6.1
  • Tokenizers 0.13.1