
lilt-en-funsd

This model is a fine-tuned version of SCUT-DLVCLab/lilt-roberta-en-base on the funsd-layoutlmv3 dataset (a hedged usage sketch follows the results list). It achieves the following results on the evaluation set:

  • Loss: 1.4114
  • Answer: {'precision': 0.8497175141242937, 'recall': 0.9204406364749081, 'f1': 0.8836662749706228, 'number': 817}
  • Header: {'precision': 0.6534653465346535, 'recall': 0.5546218487394958, 'f1': 0.6000000000000001, 'number': 119}
  • Question: {'precision': 0.8935018050541517, 'recall': 0.9192200557103064, 'f1': 0.9061784897025171, 'number': 1077}
  • Overall Precision: 0.8634
  • Overall Recall: 0.8982
  • Overall F1: 0.8804
  • Overall Accuracy: 0.8253
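
A minimal loading and inference sketch, assuming the fine-tuned weights are published under a repo id like "lilt-en-funsd" (a placeholder, since the exact hub id is not stated here) and that the base checkpoint's tokenizer is reused. LiLT takes words together with their bounding boxes normalized to a 0-1000 coordinate space:

```python
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# The base checkpoint ships a layout-aware tokenizer that accepts words plus boxes.
tokenizer = AutoTokenizer.from_pretrained("SCUT-DLVCLab/lilt-roberta-en-base")
# "lilt-en-funsd" is a placeholder for this model's actual hub repo id.
model = AutoModelForTokenClassification.from_pretrained("lilt-en-funsd")

# Words and their 0-1000-normalized bounding boxes, as produced by an OCR step.
words = ["DATE:", "March", "3,", "1998"]
boxes = [[74, 62, 134, 76], [140, 62, 195, 76], [200, 62, 224, 76], [230, 62, 274, 76]]

encoding = tokenizer(words, boxes=boxes, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**encoding)

# Map each token's highest-scoring label id to its label name.
predictions = outputs.logits.argmax(-1).squeeze().tolist()
labels = [model.config.id2label[p] for p in predictions]
print(labels)
```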

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 5e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • training_steps: 2500
  • mixed_precision_training: Native AMP
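
As a rough reproduction aid, these hyperparameters map onto a Trainer configuration as sketched below. This is a sketch only: the output directory is a placeholder, and the model, tokenized funsd-layoutlmv3 splits, data collator, and metric function that a Trainer also needs are intentionally omitted.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameter list above; Adam betas/epsilon are the Trainer defaults.
training_args = TrainingArguments(
    output_dir="lilt-en-funsd",      # placeholder output directory
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    max_steps=2500,                  # training_steps: 2500
    lr_scheduler_type="linear",
    fp16=True,                       # mixed_precision_training: Native AMP (requires a GPU)
    evaluation_strategy="steps",
    eval_steps=200,                  # matches the 200-step evaluation interval logged below
    logging_steps=200,
)
# training_args would then be passed to a Trainer together with the model,
# the tokenized datasets, a data collator, and a compute_metrics function.
```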

Training results

| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:--------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.575 | 5.26 | 200 | 0.8531 | {'precision': 0.7890295358649789, 'recall': 0.9155446756425949, 'f1': 0.8475920679886686, 'number': 817} | {'precision': 0.5416666666666666, 'recall': 0.3277310924369748, 'f1': 0.40837696335078527, 'number': 119} | {'precision': 0.8611599297012302, 'recall': 0.9099350046425255, 'f1': 0.8848758465011286, 'number': 1077} | 0.8188 | 0.8778 | 0.8473 | 0.7926 |
| 0.119 | 10.53 | 400 | 1.1026 | {'precision': 0.8278688524590164, 'recall': 0.8653610771113831, 'f1': 0.846199880311191, 'number': 817} | {'precision': 0.5813953488372093, 'recall': 0.42016806722689076, 'f1': 0.48780487804878053, 'number': 119} | {'precision': 0.856655290102389, 'recall': 0.9322191272051996, 'f1': 0.8928412627834592, 'number': 1077} | 0.8338 | 0.8748 | 0.8538 | 0.8124 |
| 0.0411 | 15.79 | 600 | 1.2238 | {'precision': 0.8713942307692307, 'recall': 0.8873929008567931, 'f1': 0.8793208004851426, 'number': 817} | {'precision': 0.552, 'recall': 0.5798319327731093, 'f1': 0.5655737704918032, 'number': 119} | {'precision': 0.8669527896995708, 'recall': 0.9377901578458682, 'f1': 0.9009812667261373, 'number': 1077} | 0.8501 | 0.8962 | 0.8726 | 0.8131 |
| 0.0186 | 21.05 | 800 | 1.2807 | {'precision': 0.8607888631090487, 'recall': 0.9082007343941249, 'f1': 0.8838594401429422, 'number': 817} | {'precision': 0.5447154471544715, 'recall': 0.5630252100840336, 'f1': 0.5537190082644629, 'number': 119} | {'precision': 0.8921389396709324, 'recall': 0.9062209842154132, 'f1': 0.8991248272685398, 'number': 1077} | 0.8586 | 0.8867 | 0.8724 | 0.8162 |
| 0.0098 | 26.32 | 1000 | 1.3494 | {'precision': 0.852233676975945, 'recall': 0.9106487148102815, 'f1': 0.8804733727810652, 'number': 817} | {'precision': 0.5511811023622047, 'recall': 0.5882352941176471, 'f1': 0.5691056910569106, 'number': 119} | {'precision': 0.8794964028776978, 'recall': 0.9080779944289693, 'f1': 0.8935587026039287, 'number': 1077} | 0.8485 | 0.8902 | 0.8688 | 0.8039 |
| 0.0068 | 31.58 | 1200 | 1.3878 | {'precision': 0.8495475113122172, 'recall': 0.9192166462668299, 'f1': 0.8830099941211051, 'number': 817} | {'precision': 0.5565217391304348, 'recall': 0.5378151260504201, 'f1': 0.547008547008547, 'number': 119} | {'precision': 0.899624765478424, 'recall': 0.8904363974001857, 'f1': 0.8950069995333644, 'number': 1077} | 0.8591 | 0.8813 | 0.8700 | 0.8140 |
| 0.0056 | 36.84 | 1400 | 1.4679 | {'precision': 0.8338833883388339, 'recall': 0.9277845777233782, 'f1': 0.8783314020857474, 'number': 817} | {'precision': 0.6442307692307693, 'recall': 0.5630252100840336, 'f1': 0.600896860986547, 'number': 119} | {'precision': 0.8971000935453695, 'recall': 0.8904363974001857, 'f1': 0.8937558247903075, 'number': 1077} | 0.8569 | 0.8862 | 0.8713 | 0.8117 |
| 0.0033 | 42.11 | 1600 | 1.3959 | {'precision': 0.8463276836158192, 'recall': 0.9167686658506732, 'f1': 0.8801410105757932, 'number': 817} | {'precision': 0.5833333333333334, 'recall': 0.5882352941176471, 'f1': 0.5857740585774059, 'number': 119} | {'precision': 0.8939114391143912, 'recall': 0.8997214484679665, 'f1': 0.8968070337806571, 'number': 1077} | 0.8559 | 0.8882 | 0.8718 | 0.8177 |
| 0.0013 | 47.37 | 1800 | 1.4114 | {'precision': 0.8497175141242937, 'recall': 0.9204406364749081, 'f1': 0.8836662749706228, 'number': 817} | {'precision': 0.6534653465346535, 'recall': 0.5546218487394958, 'f1': 0.6000000000000001, 'number': 119} | {'precision': 0.8935018050541517, 'recall': 0.9192200557103064, 'f1': 0.9061784897025171, 'number': 1077} | 0.8634 | 0.8982 | 0.8804 | 0.8253 |
| 0.001 | 52.63 | 2000 | 1.3795 | {'precision': 0.8584795321637427, 'recall': 0.8984088127294981, 'f1': 0.8779904306220095, 'number': 817} | {'precision': 0.6306306306306306, 'recall': 0.5882352941176471, 'f1': 0.6086956521739131, 'number': 119} | {'precision': 0.8965201465201466, 'recall': 0.9090064995357474, 'f1': 0.9027201475334256, 'number': 1077} | 0.8664 | 0.8857 | 0.8760 | 0.8339 |
| 0.0007 | 57.89 | 2200 | 1.4095 | {'precision': 0.8586206896551725, 'recall': 0.9143206854345165, 'f1': 0.8855957320687612, 'number': 817} | {'precision': 0.6055045871559633, 'recall': 0.5546218487394958, 'f1': 0.5789473684210525, 'number': 119} | {'precision': 0.8887884267631103, 'recall': 0.9127205199628597, 'f1': 0.9005955107650022, 'number': 1077} | 0.8614 | 0.8922 | 0.8765 | 0.8216 |
| 0.0006 | 63.16 | 2400 | 1.4001 | {'precision': 0.8577981651376146, 'recall': 0.9155446756425949, 'f1': 0.8857312018946123, 'number': 817} | {'precision': 0.6216216216216216, 'recall': 0.5798319327731093, 'f1': 0.6000000000000001, 'number': 119} | {'precision': 0.895644283121597, 'recall': 0.9164345403899722, 'f1': 0.9059201468563561, 'number': 1077} | 0.8652 | 0.8962 | 0.8804 | 0.8282 |
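
The per-type Answer/Header/Question entries and the overall precision/recall/F1/accuracy values above are entity-level metrics of the kind reported by the seqeval metric. A minimal sketch of that computation, using made-up BIO-tagged label sequences purely to illustrate the output format (the actual FUNSD label names are an assumption):

```python
import evaluate

# Entity-level evaluation as performed by the seqeval metric.
seqeval = evaluate.load("seqeval")

# Toy label sequences; real evaluation would use word-level predictions and references.
references  = [["B-QUESTION", "I-QUESTION", "B-ANSWER", "O", "B-HEADER"]]
predictions = [["B-QUESTION", "I-QUESTION", "B-ANSWER", "O", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
# results holds per-type dicts (precision/recall/f1/number) plus overall_precision,
# overall_recall, overall_f1 and overall_accuracy, matching the table columns above.
print(results)
```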

Framework versions

  • Transformers 4.28.1
  • Pytorch 1.13.0+cu117
  • Datasets 2.11.0
  • Tokenizers 0.13.2