---
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
  - generated_from_trainer
datasets:
  - funsd
model-index:
  - name: layoutlm-funsd
    results: []
---

# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the results list):

- Loss: 0.6766
- Answer: precision 0.6937, recall 0.8035, F1 0.7446 (support: 809)
- Header: precision 0.3415, recall 0.3529, F1 0.3471 (support: 119)
- Question: precision 0.7880, recall 0.8376, F1 0.8120 (support: 1065)
- Overall Precision: 0.7226
- Overall Recall: 0.7948
- Overall F1: 0.7570
- Overall Accuracy: 0.8076
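
Below is a minimal, hedged sketch of loading this checkpoint for token classification with the `transformers` library. The repo id `lmurray/layoutlm-funsd` is inferred from this card and may differ; the words and bounding boxes are placeholder values, since LayoutLM expects one 0-1000-normalized box per token from an external OCR step.

```python
import torch
from transformers import LayoutLMForTokenClassification, LayoutLMTokenizerFast

model_id = "lmurray/layoutlm-funsd"  # assumed repo id; adjust to the actual one
tokenizer = LayoutLMTokenizerFast.from_pretrained(model_id)
model = LayoutLMForTokenClassification.from_pretrained(model_id)
model.eval()

# Placeholder OCR output: words plus 0-1000 normalized bounding boxes.
words = ["Invoice", "Date:", "2020-01-01"]
word_boxes = [[50, 40, 180, 60], [200, 40, 260, 60], [270, 40, 400, 60]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Map each sub-token back to its word so it inherits that word's box;
# special tokens ([CLS]/[SEP]) get a dummy box.
bbox = [
    [0, 0, 0, 0] if idx is None else word_boxes[idx]
    for idx in encoding.word_ids(0)
]
encoding["bbox"] = torch.tensor([bbox])

with torch.no_grad():
    logits = model(**encoding).logits  # shape: (1, seq_len, num_labels)

predicted = logits.argmax(-1).squeeze(0).tolist()
tokens = tokenizer.convert_ids_to_tokens(encoding["input_ids"][0])
print([(t, model.config.id2label[p]) for t, p in zip(tokens, predicted)])
```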

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
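
As a hedged illustration (not the exact training script), these settings map onto `transformers.TrainingArguments` roughly as follows; `output_dir` and the per-device interpretation of the batch sizes are assumptions not stated in the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlm-funsd",       # assumed output directory
    learning_rate=3e-5,
    per_device_train_batch_size=16,    # card lists train_batch_size: 16
    per_device_eval_batch_size=8,      # card lists eval_batch_size: 8
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                         # "Native AMP" mixed precision
)
```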

### Training results

| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:--------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.7798 | 1.0 | 10 | 1.5839 | {'precision': 0.01451378809869376, 'recall': 0.012360939431396786, 'f1': 0.01335113484646195, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.1924342105263158, 'recall': 0.10985915492957747, 'f1': 0.1398684997011357, 'number': 1065} | 0.0979 | 0.0637 | 0.0772 | 0.3437 |
| 1.4432 | 2.0 | 20 | 1.2365 | {'precision': 0.24876604146100692, 'recall': 0.311495673671199, 'f1': 0.27661909989023054, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.4279411764705882, 'recall': 0.5464788732394367, 'f1': 0.48000000000000004, 'number': 1065} | 0.3515 | 0.4185 | 0.3820 | 0.5956 |
| 1.0756 | 3.0 | 30 | 0.9004 | {'precision': 0.5348399246704332, 'recall': 0.7021013597033374, 'f1': 0.6071619454836986, 'number': 809} | {'precision': 0.03225806451612903, 'recall': 0.008403361344537815, 'f1': 0.013333333333333332, 'number': 119} | {'precision': 0.5821917808219178, 'recall': 0.7183098591549296, 'f1': 0.6431273644388399, 'number': 1065} | 0.5542 | 0.6693 | 0.6064 | 0.7160 |
| 0.8288 | 4.0 | 40 | 0.7621 | {'precision': 0.6007984031936128, 'recall': 0.7441285537700866, 'f1': 0.6648260629486471, 'number': 809} | {'precision': 0.10526315789473684, 'recall': 0.06722689075630252, 'f1': 0.08205128205128205, 'number': 119} | {'precision': 0.6632825719120136, 'recall': 0.7361502347417841, 'f1': 0.6978193146417446, 'number': 1065} | 0.6168 | 0.6994 | 0.6555 | 0.7647 |
| 0.6673 | 5.0 | 50 | 0.7300 | {'precision': 0.6344314558979809, 'recall': 0.7379480840543882, 'f1': 0.6822857142857143, 'number': 809} | {'precision': 0.2247191011235955, 'recall': 0.16806722689075632, 'f1': 0.19230769230769232, 'number': 119} | {'precision': 0.6439909297052154, 'recall': 0.8, 'f1': 0.71356783919598, 'number': 1065} | 0.6243 | 0.7371 | 0.6760 | 0.7704 |
| 0.5732 | 6.0 | 60 | 0.6729 | {'precision': 0.6424974823766365, 'recall': 0.788627935723115, 'f1': 0.7081021087680356, 'number': 809} | {'precision': 0.21, 'recall': 0.17647058823529413, 'f1': 0.19178082191780824, 'number': 119} | {'precision': 0.7048494983277592, 'recall': 0.7915492957746478, 'f1': 0.7456877487837241, 'number': 1065} | 0.6562 | 0.7536 | 0.7015 | 0.7928 |
| 0.5005 | 7.0 | 70 | 0.6515 | {'precision': 0.6611740473738414, 'recall': 0.7935723114956736, 'f1': 0.7213483146067416, 'number': 809} | {'precision': 0.2288135593220339, 'recall': 0.226890756302521, 'f1': 0.22784810126582278, 'number': 119} | {'precision': 0.7432784041630529, 'recall': 0.8046948356807512, 'f1': 0.7727682596934174, 'number': 1065} | 0.6806 | 0.7657 | 0.7207 | 0.8005 |
| 0.4473 | 8.0 | 80 | 0.6629 | {'precision': 0.6748717948717948, 'recall': 0.8133498145859085, 'f1': 0.7376681614349776, 'number': 809} | {'precision': 0.27586206896551724, 'recall': 0.2689075630252101, 'f1': 0.27234042553191484, 'number': 119} | {'precision': 0.7614520311149524, 'recall': 0.8272300469483568, 'f1': 0.7929792979297929, 'number': 1065} | 0.6988 | 0.7883 | 0.7409 | 0.7983 |
| 0.3953 | 9.0 | 90 | 0.6507 | {'precision': 0.6750524109014675, 'recall': 0.796044499381953, 'f1': 0.7305728871242201, 'number': 809} | {'precision': 0.3008849557522124, 'recall': 0.2857142857142857, 'f1': 0.29310344827586204, 'number': 119} | {'precision': 0.7689625108979947, 'recall': 0.828169014084507, 'f1': 0.7974683544303798, 'number': 1065} | 0.7046 | 0.7827 | 0.7416 | 0.8031 |
| 0.3926 | 10.0 | 100 | 0.6539 | {'precision': 0.6704663212435233, 'recall': 0.799752781211372, 'f1': 0.729425028184893, 'number': 809} | {'precision': 0.3217391304347826, 'recall': 0.31092436974789917, 'f1': 0.3162393162393162, 'number': 119} | {'precision': 0.7722513089005235, 'recall': 0.8309859154929577, 'f1': 0.8005427408412483, 'number': 1065} | 0.7049 | 0.7873 | 0.7438 | 0.8046 |
| 0.3344 | 11.0 | 110 | 0.6626 | {'precision': 0.6831578947368421, 'recall': 0.8022249690976514, 'f1': 0.7379192723138146, 'number': 809} | {'precision': 0.3140495867768595, 'recall': 0.31932773109243695, 'f1': 0.31666666666666665, 'number': 119} | {'precision': 0.7677029360967185, 'recall': 0.8347417840375587, 'f1': 0.7998200629779576, 'number': 1065} | 0.7070 | 0.7908 | 0.7466 | 0.8054 |
| 0.316 | 12.0 | 120 | 0.6678 | {'precision': 0.6871035940803383, 'recall': 0.8034610630407911, 'f1': 0.7407407407407406, 'number': 809} | {'precision': 0.31666666666666665, 'recall': 0.31932773109243695, 'f1': 0.3179916317991632, 'number': 119} | {'precision': 0.7860300618921309, 'recall': 0.8347417840375587, 'f1': 0.8096539162112932, 'number': 1065} | 0.7178 | 0.7913 | 0.7527 | 0.8071 |
| 0.3001 | 13.0 | 130 | 0.6758 | {'precision': 0.6856540084388185, 'recall': 0.8034610630407911, 'f1': 0.7398975526465565, 'number': 809} | {'precision': 0.325, 'recall': 0.3277310924369748, 'f1': 0.3263598326359833, 'number': 119} | {'precision': 0.792, 'recall': 0.8366197183098592, 'f1': 0.8136986301369863, 'number': 1065} | 0.7205 | 0.7928 | 0.7549 | 0.8082 |
| 0.279 | 14.0 | 140 | 0.6750 | {'precision': 0.6875, 'recall': 0.8022249690976514, 'f1': 0.7404449515116942, 'number': 809} | {'precision': 0.31451612903225806, 'recall': 0.3277310924369748, 'f1': 0.32098765432098764, 'number': 119} | {'precision': 0.7816901408450704, 'recall': 0.8338028169014085, 'f1': 0.8069059518400727, 'number': 1065} | 0.7151 | 0.7908 | 0.7510 | 0.8066 |
| 0.2793 | 15.0 | 150 | 0.6766 | {'precision': 0.6937033084311632, 'recall': 0.8034610630407911, 'f1': 0.7445589919816725, 'number': 809} | {'precision': 0.34146341463414637, 'recall': 0.35294117647058826, 'f1': 0.34710743801652894, 'number': 119} | {'precision': 0.7879858657243817, 'recall': 0.8375586854460094, 'f1': 0.812016385980883, 'number': 1065} | 0.7226 | 0.7948 | 0.7570 | 0.8076 |
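
The per-field precision/recall/F1/support values in the table above are consistent with entity-level scoring of the FUNSD BIO tags (HEADER, QUESTION, ANSWER), as produced by `seqeval`. The snippet below is a hedged sketch of such a `compute_metrics` function for `Trainer`, not the exact code used for this card; the label order is an assumption.

```python
import numpy as np
from seqeval.metrics import accuracy_score, f1_score, precision_score, recall_score

# Assumed FUNSD label order; check the dataset's features for the real mapping.
label_list = ["O", "B-HEADER", "I-HEADER", "B-QUESTION", "I-QUESTION", "B-ANSWER", "I-ANSWER"]

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Keep only positions with a real label (the ignore index is -100).
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]

    return {
        "precision": precision_score(true_labels, true_predictions),
        "recall": recall_score(true_labels, true_predictions),
        "f1": f1_score(true_labels, true_predictions),
        "accuracy": accuracy_score(true_labels, true_predictions),
    }
```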

### Framework versions

- Transformers 4.41.2
- PyTorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
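
To compare a local environment against the versions pinned above, a small illustrative check:

```python
from importlib.metadata import version

expected = {
    "transformers": "4.41.2",
    "torch": "2.3.0+cu121",
    "datasets": "2.20.0",
    "tokenizers": "0.19.1",
}
for package, want in expected.items():
    have = version(package)
    print(f"{package}: {have}" + ("" if have == want else f" (card used {want})"))
```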