---
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
  - generated_from_trainer
datasets:
  - funsd
model-index:
  - name: layoutlm-funsd
    results: []
---

# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD dataset. It achieves the following results on the evaluation set:

- Loss: 0.6739

| Entity   | Precision | Recall | F1     | Support |
|:---------|----------:|-------:|-------:|--------:|
| Answer   | 0.7077    | 0.8171 | 0.7585 |     809 |
| Header   | 0.3066    | 0.3529 | 0.3281 |     119 |
| Question | 0.7837    | 0.8235 | 0.8031 |    1065 |

- Overall Precision: 0.7215
- Overall Recall: 0.7928
- Overall F1: 0.7554
- Overall Accuracy: 0.8075

## Model description

LayoutLM is a BERT-style transformer that jointly encodes token text and 2-D layout (bounding-box) information from scanned documents. This checkpoint fine-tunes `microsoft/layoutlm-base-uncased` as a token classifier for form understanding, labeling tokens as question, answer, or header.

## Intended uses & limitations

The model is intended for token classification (semantic entity labeling) on scanned, form-like English documents. Note the gap between entity types in the results above: header fields are recovered far less reliably (F1 ≈ 0.33) than questions and answers (F1 ≈ 0.76 and 0.80), and the small, noisy-scan FUNSD domain may not transfer to other document types.
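
A minimal inference sketch follows. The hub id `Mocha2471/layoutlm-funsd` is assumed from this repository's location and may need adjusting; the words and boxes are illustrative stand-ins for real OCR output, normalized to the 0-1000 coordinate scale LayoutLM expects:

```python
import torch
from transformers import AutoTokenizer, LayoutLMForTokenClassification

model_id = "Mocha2471/layoutlm-funsd"  # assumed hub id for this checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = LayoutLMForTokenClassification.from_pretrained(model_id)

# Words and 0-1000-normalized boxes, as produced by an OCR engine (illustrative values).
words = ["Date:", "March", "3,", "1999"]
boxes = [[48, 84, 105, 98], [110, 84, 155, 98], [158, 84, 175, 98], [178, 84, 208, 98]]

# Tokenize pre-split words and repeat each word's box for every sub-token;
# special tokens ([CLS], [SEP]) get a dummy box.
encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
token_boxes = [boxes[i] if i is not None else [0, 0, 0, 0]
               for i in encoding.word_ids(0)]

with torch.no_grad():
    outputs = model(input_ids=encoding["input_ids"],
                    attention_mask=encoding["attention_mask"],
                    bbox=torch.tensor([token_boxes]))

predictions = outputs.logits.argmax(-1).squeeze().tolist()
tokens = tokenizer.convert_ids_to_tokens(encoding["input_ids"][0])
print([(t, model.config.id2label[p]) for t, p in zip(tokens, predictions)])
```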

## Training and evaluation data

The model was fine-tuned and evaluated on FUNSD (Form Understanding in Noisy Scanned Documents), a dataset of 199 annotated scanned forms split into 149 training and 50 test documents, with word-level bounding boxes and BIO-encoded header/question/answer labels.
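
For reference, FUNSD can be pulled from the Hub roughly as follows. The repository id `nielsr/funsd` and the field names are those of a common hub copy of the dataset and may differ from the exact copy used for this run:

```python
from datasets import load_dataset

# One common hub copy of FUNSD (assumption: this run may have used another copy).
dataset = load_dataset("nielsr/funsd")
print(dataset)  # expected splits: train (149 forms) and test (50 forms)

example = dataset["train"][0]
print(example["words"][:5])     # OCR words
print(example["bboxes"][:5])    # word-level boxes, normalized to 0-1000
print(example["ner_tags"][:5])  # BIO tags over HEADER / QUESTION / ANSWER / O
```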

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
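
A minimal `Trainer` sketch wiring up these hyperparameters might look like the following. `encoded_train`, `encoded_test`, and `compute_metrics` are hypothetical placeholders for the preprocessed datasets and the metric function (see the sketch under "Training results"):

```python
from transformers import LayoutLMForTokenClassification, Trainer, TrainingArguments

labels = ["O", "B-HEADER", "I-HEADER", "B-QUESTION", "I-QUESTION", "B-ANSWER", "I-ANSWER"]
model = LayoutLMForTokenClassification.from_pretrained(
    "microsoft/layoutlm-base-uncased",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)

args = TrainingArguments(
    output_dir="layoutlm-funsd",
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    num_train_epochs=15,
    lr_scheduler_type="linear",      # the Adam betas/epsilon above are the defaults
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
    evaluation_strategy="epoch",     # assumption, matching the per-epoch results below
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded_train,     # hypothetical: tokenized FUNSD train split
    eval_dataset=encoded_test,       # hypothetical: tokenized FUNSD test split
    compute_metrics=compute_metrics, # hypothetical: seqeval-based metric function
)
trainer.train()
```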

### Training results

| Training Loss | Epoch | Step | Validation Loss | Answer (P / R / F1) | Header (P / R / F1) | Question (P / R / F1) | Overall P | Overall R | Overall F1 | Overall Acc. |
|--:|--:|--:|--:|:--|:--|:--|--:|--:|--:|--:|
| 1.7578 | 1.0 | 10 | 1.5659 | 0.0201 / 0.0185 / 0.0193 | 0.0 / 0.0 / 0.0 | 0.3119 / 0.2685 / 0.2886 | 0.1808 | 0.1510 | 0.1646 | 0.3760 |
| 1.4090 | 2.0 | 20 | 1.2205 | 0.2208 / 0.2126 / 0.2166 | 0.0 / 0.0 / 0.0 | 0.4326 / 0.5512 / 0.4847 | 0.3553 | 0.3808 | 0.3676 | 0.5932 |
| 1.0728 | 3.0 | 30 | 0.9396 | 0.5073 / 0.6032 / 0.5511 | 0.0270 / 0.0084 / 0.0128 | 0.5947 / 0.6986 / 0.6425 | 0.5480 | 0.6187 | 0.5812 | 0.7236 |
| 0.8188 | 4.0 | 40 | 0.7725 | 0.6077 / 0.7429 / 0.6685 | 0.1800 / 0.0756 / 0.1065 | 0.6798 / 0.7474 / 0.7120 | 0.6362 | 0.7055 | 0.6690 | 0.7680 |
| 0.6647 | 5.0 | 50 | 0.7205 | 0.6302 / 0.7330 / 0.6777 | 0.2209 / 0.1597 / 0.1854 | 0.6649 / 0.8122 / 0.7312 | 0.6345 | 0.7411 | 0.6836 | 0.7775 |
| 0.5719 | 6.0 | 60 | 0.6793 | 0.6366 / 0.7948 / 0.7070 | 0.2530 / 0.1765 / 0.2079 | 0.7342 / 0.7653 / 0.7494 | 0.6714 | 0.7421 | 0.7050 | 0.7826 |
| 0.5011 | 7.0 | 70 | 0.6617 | 0.6698 / 0.7973 / 0.7280 | 0.2435 / 0.2353 / 0.2393 | 0.7498 / 0.7906 / 0.7697 | 0.6883 | 0.7602 | 0.7225 | 0.7929 |
| 0.4478 | 8.0 | 80 | 0.6529 | 0.6726 / 0.7973 / 0.7296 | 0.2358 / 0.2437 / 0.2397 | 0.7578 / 0.8169 / 0.7863 | 0.6924 | 0.7747 | 0.7312 | 0.8001 |
| 0.3901 | 9.0 | 90 | 0.6513 | 0.6936 / 0.7948 / 0.7408 | 0.2791 / 0.3025 / 0.2903 | 0.7517 / 0.8244 / 0.7864 | 0.7001 | 0.7812 | 0.7384 | 0.8034 |
| 0.3881 | 10.0 | 100 | 0.6564 | 0.6859 / 0.8232 / 0.7483 | 0.3063 / 0.2857 / 0.2957 | 0.7703 / 0.8122 / 0.7907 | 0.7098 | 0.7852 | 0.7456 | 0.8075 |
| 0.3249 | 11.0 | 110 | 0.6580 | 0.7036 / 0.8158 / 0.7556 | 0.3101 / 0.3361 / 0.3226 | 0.7694 / 0.8300 / 0.7986 | 0.7148 | 0.7948 | 0.7527 | 0.8088 |
| 0.3099 | 12.0 | 120 | 0.6646 | 0.7091 / 0.8195 / 0.7603 | 0.2941 / 0.3361 / 0.3137 | 0.7798 / 0.8178 / 0.7984 | 0.7194 | 0.7898 | 0.7529 | 0.8098 |
| 0.2907 | 13.0 | 130 | 0.6653 | 0.7141 / 0.8183 / 0.7627 | 0.3125 / 0.3361 / 0.3239 | 0.7903 / 0.8244 / 0.8070 | 0.7295 | 0.7928 | 0.7598 | 0.8104 |
| 0.2715 | 14.0 | 140 | 0.6720 | 0.7126 / 0.8183 / 0.7618 | 0.3134 / 0.3529 / 0.3320 | 0.7867 / 0.8244 / 0.8051 | 0.7260 | 0.7938 | 0.7584 | 0.8078 |
| 0.2743 | 15.0 | 150 | 0.6739 | 0.7077 / 0.8171 / 0.7585 | 0.3066 / 0.3529 / 0.3281 | 0.7837 / 0.8235 / 0.8031 | 0.7215 | 0.7928 | 0.7554 | 0.8075 |

Per-entity scores are precision / recall / F1, rounded to four decimals; support is constant across epochs (Answer: 809, Header: 119, Question: 1065).
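
The per-entity scores above are the output of the `seqeval` metric. A typical `compute_metrics` for this setup, sketched here assuming `label_list` holds the seven BIO labels in id order, looks like:

```python
import numpy as np
from datasets import load_metric  # deprecated but present in Datasets 2.19; evaluate.load("seqeval") also works

metric = load_metric("seqeval")
label_list = ["O", "B-HEADER", "I-HEADER", "B-QUESTION", "I-QUESTION", "B-ANSWER", "I-ANSWER"]

def compute_metrics(p):
    predictions, labels = p
    predictions = np.argmax(predictions, axis=2)
    # Drop padded/special positions, which carry the ignore index -100.
    true_predictions = [
        [label_list[pr] for pr, la in zip(pred, lab) if la != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[la] for pr, la in zip(pred, lab) if la != -100]
        for pred, lab in zip(predictions, labels)
    ]
    results = metric.compute(predictions=true_predictions, references=true_labels)
    # seqeval emits one {precision, recall, f1, number} dict per entity type
    # (ANSWER, HEADER, QUESTION) plus the overall scores reported above.
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```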

### Framework versions

- Transformers 4.40.0
- Pytorch 2.2.1+cu121
- Datasets 2.19.0
- Tokenizers 0.19.1