---
license: mit
tags:
  - generated_from_trainer
datasets:
  - funsd
model-index:
  - name: layoutlm-funsd
    results: []
---

# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD dataset. It achieves the following results on the evaluation set:

- Loss: 0.6888
- Answer: precision 0.6960, recall 0.7923, F1 0.7410 (support: 809)
- Header: precision 0.3629, recall 0.3782, F1 0.3704 (support: 119)
- Question: precision 0.7736, recall 0.8150, F1 0.7938 (support: 1065)
- Overall precision: 0.7171
- Overall recall: 0.7797
- Overall F1: 0.7471
- Overall accuracy: 0.8084
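As a sanity check, the overall precision, recall, and F1 follow from micro-averaging the per-entity results above (this is how seqeval aggregates span-level scores): `recall × support` recovers the true-positive count for each entity type, and `TP / precision` recovers the number of predicted spans. A minimal sketch using the reported per-entity numbers:

```python
# Reconstruct the overall metrics by micro-averaging the per-entity results
# reported above (supports: Answer 809, Header 119, Question 1065).
entities = {
    "Answer":   {"precision": 0.6959826275787188, "recall": 0.792336217552534,  "number": 809},
    "Header":   {"precision": 0.3629032258064516, "recall": 0.37815126050420167, "number": 119},
    "Question": {"precision": 0.7736185383244206, "recall": 0.8150234741784037, "number": 1065},
}

tp = pred = gold = 0.0
for m in entities.values():
    e_tp = m["recall"] * m["number"]   # true positives for this entity type
    tp += e_tp
    pred += e_tp / m["precision"]      # predicted spans = TP / precision
    gold += m["number"]                # gold spans = support

precision = tp / pred
recall = tp / gold
f1 = 2 * precision * recall / (precision + recall)
print(round(precision, 4), round(recall, 4), round(f1, 4))  # 0.7171 0.7797 0.7471
```

This matches the reported overall precision (0.7171), recall (0.7797), and F1 (0.7471), confirming the overall numbers are the micro-average, not the mean of the per-entity scores.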

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
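With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate ramps (if warmup were used) and then decays linearly from 3e-05 to zero over the 150 total optimization steps (15 epochs × 10 steps per epoch, as the results table shows). A minimal sketch of that schedule, assuming zero warmup, in the style of `get_linear_schedule_with_warmup` from transformers:

```python
def linear_lr(step, base_lr=3e-5, warmup_steps=0, total_steps=150):
    """Linear schedule with warmup: ramp up over warmup_steps, then decay
    linearly so the learning rate reaches zero at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

print(linear_lr(0))    # 3e-05 (no warmup assumed)
print(linear_lr(75))   # 1.5e-05, halfway through training
print(linear_lr(150))  # 0.0
```

Because the schedule decays per optimization step rather than per epoch, changing the batch size or dataset size changes the decay rate even with the same `num_epochs`.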

### Training results

Per-entity columns give precision / recall / F1, rounded to four decimals (supports: Answer 809, Header 119, Question 1065).

| Training Loss | Epoch | Step | Validation Loss | Answer P / R / F1 | Header P / R / F1 | Question P / R / F1 | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.8101 | 1.0 | 10 | 1.5789 | 0.0143 / 0.0161 / 0.0152 | 0.0000 / 0.0000 / 0.0000 | 0.1038 / 0.0742 / 0.0865 | 0.0552 | 0.0462 | 0.0503 | 0.3845 |
| 1.4764 | 2.0 | 20 | 1.2528 | 0.1622 / 0.1483 / 0.1549 | 0.0000 / 0.0000 / 0.0000 | 0.4530 / 0.5155 / 0.4822 | 0.3427 | 0.3357 | 0.3392 | 0.5948 |
| 1.1060 | 3.0 | 30 | 0.9703 | 0.4956 / 0.5538 / 0.5231 | 0.0000 / 0.0000 / 0.0000 | 0.6459 / 0.6507 / 0.6483 | 0.5679 | 0.5725 | 0.5702 | 0.7117 |
| 0.8412 | 4.0 | 40 | 0.7859 | 0.6176 / 0.7367 / 0.6719 | 0.1964 / 0.0924 / 0.1257 | 0.7102 / 0.7042 / 0.7072 | 0.6533 | 0.6809 | 0.6668 | 0.7606 |
| 0.6772 | 5.0 | 50 | 0.7168 | 0.6396 / 0.7874 / 0.7058 | 0.1748 / 0.1513 / 0.1622 | 0.7301 / 0.7568 / 0.7432 | 0.6632 | 0.7331 | 0.6964 | 0.7834 |
| 0.5710 | 6.0 | 60 | 0.6881 | 0.6597 / 0.7763 / 0.7132 | 0.2346 / 0.1597 / 0.1900 | 0.7077 / 0.8207 / 0.7600 | 0.6706 | 0.7632 | 0.7139 | 0.7930 |
| 0.5021 | 7.0 | 70 | 0.6724 | 0.6695 / 0.7862 / 0.7231 | 0.2542 / 0.2521 / 0.2532 | 0.7304 / 0.8038 / 0.7653 | 0.6795 | 0.7637 | 0.7191 | 0.7968 |
| 0.4540 | 8.0 | 80 | 0.6567 | 0.6835 / 0.7849 / 0.7307 | 0.3505 / 0.2857 / 0.3148 | 0.7604 / 0.8047 / 0.7819 | 0.7088 | 0.7657 | 0.7361 | 0.8040 |
| 0.4011 | 9.0 | 90 | 0.6651 | 0.6748 / 0.7849 / 0.7257 | 0.3016 / 0.3193 / 0.3102 | 0.7593 / 0.8085 / 0.7831 | 0.6970 | 0.7697 | 0.7315 | 0.8006 |
| 0.3604 | 10.0 | 100 | 0.6693 | 0.6716 / 0.7812 / 0.7223 | 0.3243 / 0.3025 / 0.3130 | 0.7441 / 0.8300 / 0.7847 | 0.6929 | 0.7787 | 0.7333 | 0.7999 |
| 0.3269 | 11.0 | 110 | 0.6750 | 0.6823 / 0.7911 / 0.7327 | 0.3784 / 0.3529 / 0.3652 | 0.7705 / 0.8103 / 0.7899 | 0.7123 | 0.7752 | 0.7424 | 0.8068 |
| 0.3069 | 12.0 | 120 | 0.6782 | 0.6866 / 0.7936 / 0.7362 | 0.3866 / 0.3866 / 0.3866 | 0.7772 / 0.8056 / 0.7911 | 0.7164 | 0.7757 | 0.7449 | 0.8062 |
| 0.2930 | 13.0 | 130 | 0.6901 | 0.6992 / 0.7874 / 0.7407 | 0.3983 / 0.3950 / 0.3966 | 0.7751 / 0.8122 / 0.7932 | 0.7221 | 0.7772 | 0.7487 | 0.8057 |
| 0.2775 | 14.0 | 140 | 0.6842 | 0.6945 / 0.8010 / 0.7440 | 0.3636 / 0.3697 / 0.3667 | 0.7723 / 0.8122 / 0.7918 | 0.7162 | 0.7812 | 0.7473 | 0.8068 |
| 0.2724 | 15.0 | 150 | 0.6888 | 0.6960 / 0.7923 / 0.7410 | 0.3629 / 0.3782 / 0.3704 | 0.7736 / 0.8150 / 0.7938 | 0.7171 | 0.7797 | 0.7471 | 0.8084 |

### Framework versions

- Transformers 4.30.0
- Pytorch 2.2.1+cu121
- Datasets 2.19.0
- Tokenizers 0.13.3