|
---
license: mit
tags:
- generated_from_trainer
datasets:
- funsd-layoutlmv3
model-index:
- name: layoutxlm
  results: []
---
|
|
|
|
|
|
# layoutxlm |
|
|
|
This model is a fine-tuned version of [SCUT-DLVCLab/lilt-roberta-en-base](https://huggingface.co/SCUT-DLVCLab/lilt-roberta-en-base) on the funsd-layoutlmv3 dataset. |
|
It achieves the following results on the evaluation set: |
|
- Loss: 1.5889
- Overall Precision: 0.8728
- Overall Recall: 0.8967
- Overall F1: 0.8846
- Overall Accuracy: 0.8115

Per-label results:

| Label    | Precision | Recall | F1     | Support |
|:--------:|:---------:|:------:|:------:|:-------:|
| Answer   | 0.8762    | 0.9009 | 0.8884 | 817     |
| Header   | 0.6667    | 0.5546 | 0.6055 | 119     |
| Question | 0.8884    | 0.9313 | 0.9093 | 1077    |
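
These numbers follow the output format of the `seqeval` metric, which reports one precision/recall/F1/support dict per entity type plus overall scores. A minimal sketch of how such values are computed; the tag sequences here are illustrative, not taken from the actual evaluation:

```python
import evaluate  # pip install evaluate seqeval

seqeval = evaluate.load("seqeval")

# Illustrative IOB2 tag sequences; the real evaluation runs over the FUNSD test set.
references = [["B-QUESTION", "I-QUESTION", "B-ANSWER", "O"]]
predictions = [["B-QUESTION", "I-QUESTION", "B-ANSWER", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["QUESTION"])    # {'precision': ..., 'recall': ..., 'f1': ..., 'number': ...}
print(results["overall_f1"])  # scalar, as reported above
```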
|
|
|
## Model description |
|
|
|
This checkpoint fine-tunes LiLT (Language-Independent Layout Transformer) with an English RoBERTa text backbone, [SCUT-DLVCLab/lilt-roberta-en-base](https://huggingface.co/SCUT-DLVCLab/lilt-roberta-en-base), for token classification on form documents. Note that despite the repository name `layoutxlm`, the base model is LiLT, not LayoutXLM.
|
|
|
## Intended uses & limitations |
|
|
|
The model is intended for form understanding: given OCR words and their bounding boxes, it tags tokens as question, answer, or header entities (BIO scheme). It was fine-tuned on FUNSD, a small corpus of noisy scanned English forms, so performance on other languages, layouts, or cleaner document types is not guaranteed; header entities in particular are detected less reliably (F1 ≈ 0.61) than questions and answers. A usage sketch follows below.
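
A minimal inference sketch. The repo id, words, and boxes below are placeholders; boxes must be normalized to the 0–1000 range that LiLT expects:

```python
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

repo_id = "username/layoutxlm"  # placeholder: wherever this checkpoint is hosted
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForTokenClassification.from_pretrained(repo_id)

# Words and their bounding boxes (normalized to 0-1000), e.g. from an OCR engine.
words = ["Date:", "2016-01-04"]
boxes = [[70, 60, 150, 80], [160, 60, 330, 80]]

enc = tokenizer(words, is_split_into_words=True, return_tensors="pt", truncation=True)
# Expand word-level boxes to token level; special tokens get a zero box.
token_boxes = [[0, 0, 0, 0] if i is None else boxes[i] for i in enc.word_ids(0)]
enc["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**enc).logits
predicted = [model.config.id2label[p] for p in logits.argmax(-1)[0].tolist()]
print(list(zip(enc.tokens(), predicted)))
```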
|
|
|
## Training and evaluation data |
|
|
|
Training and evaluation used the funsd-layoutlmv3 dataset, presumably the [nielsr/funsd-layoutlmv3](https://huggingface.co/datasets/nielsr/funsd-layoutlmv3) preprocessing of FUNSD: 149 training and 50 test forms annotated with OCR words, normalized bounding boxes, and BIO tags for the question, answer, and header labels. A loading sketch follows below.
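
A sketch of loading the dataset; the Hub id and column names are assumptions based on the dataset name above:

```python
from datasets import load_dataset

# Assumption: the Hub dataset nielsr/funsd-layoutlmv3 is the one referenced above.
dataset = load_dataset("nielsr/funsd-layoutlmv3")
print(dataset)                # train / test splits

example = dataset["train"][0]
print(example["tokens"][:5])    # OCR words
print(example["bboxes"][:5])    # 0-1000 normalized boxes
print(example["ner_tags"][:5])  # BIO label ids
```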
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training: |
|
- learning_rate: 5e-05 |
|
- train_batch_size: 8 |
|
- eval_batch_size: 8 |
|
- seed: 42 |
|
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 |
|
- lr_scheduler_type: linear |
|
- training_steps: 2500 |
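
A sketch of equivalent `TrainingArguments` (argument names follow the Transformers 4.30 API; the 200-step evaluation cadence is inferred from the results table below):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutxlm",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=2500,
    evaluation_strategy="steps",  # evaluated every 200 steps per the log below
    eval_steps=200,
)
```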
|
|
|
### Training results |
|
|
|
Per-label support is constant across evaluations (Answer 817, Header 119, Question 1077); per-label cells show precision / recall / F1.

| Training Loss | Epoch  | Step | Validation Loss | Answer (P / R / F1)      | Header (P / R / F1)      | Question (P / R / F1)    | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:------:|:----:|:---------------:|:------------------------:|:------------------------:|:------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.4322        | 10.53  | 200  | 0.9083          | 0.7705 / 0.8874 / 0.8248 | 0.6163 / 0.4454 / 0.5171 | 0.8669 / 0.8644 / 0.8656 | 0.8134            | 0.8490         | 0.8308     | 0.7863           |
| 0.0467        | 21.05  | 400  | 1.2942          | 0.8497 / 0.9131 / 0.8802 | 0.6585 / 0.4538 / 0.5373 | 0.8860 / 0.9164 / 0.9010 | 0.8616            | 0.8877         | 0.8745     | 0.7966           |
| 0.015         | 31.58  | 600  | 1.2662          | 0.8575 / 0.9058 / 0.8810 | 0.5304 / 0.5126 / 0.5214 | 0.8794 / 0.9071 / 0.8931 | 0.8511            | 0.8833         | 0.8669     | 0.8114           |
| 0.0081        | 42.11  | 800  | 1.5223          | 0.8710 / 0.8764 / 0.8737 | 0.5882 / 0.5882 / 0.5882 | 0.8886 / 0.9034 / 0.8959 | 0.8639            | 0.8738         | 0.8689     | 0.8041           |
| 0.0033        | 52.63  | 1000 | 1.4361          | 0.8502 / 0.9033 / 0.8760 | 0.6145 / 0.4286 / 0.5050 | 0.8768 / 0.9248 / 0.9001 | 0.8553            | 0.8867         | 0.8707     | 0.8156           |
| 0.0026        | 63.16  | 1200 | 1.4994          | 0.8616 / 0.9217 / 0.8906 | 0.5981 / 0.5378 / 0.5664 | 0.8945 / 0.9136 / 0.9040 | 0.8654            | 0.8947         | 0.8798     | 0.8208           |
| 0.0016        | 73.68  | 1400 | 1.6091          | 0.8581 / 0.9033 / 0.8801 | 0.5980 / 0.5126 / 0.5520 | 0.8948 / 0.9081 / 0.9014 | 0.8647            | 0.8828         | 0.8736     | 0.8167           |
| 0.0009        | 84.21  | 1600 | 1.6010          | 0.8591 / 0.9106 / 0.8841 | 0.6742 / 0.5042 / 0.5769 | 0.8883 / 0.9229 / 0.9053 | 0.8669            | 0.8932         | 0.8799     | 0.8049           |
| 0.0006        | 94.74  | 1800 | 1.5889          | 0.8762 / 0.9009 / 0.8884 | 0.6667 / 0.5546 / 0.6055 | 0.8884 / 0.9313 / 0.9093 | 0.8728            | 0.8967         | 0.8846     | 0.8115           |
| 0.0004        | 105.26 | 2000 | 1.6126          | 0.8635 / 0.9058 / 0.8841 | 0.6538 / 0.5714 / 0.6099 | 0.8944 / 0.9201 / 0.9071 | 0.8695            | 0.8937         | 0.8814     | 0.8127           |
| 0.0004        | 115.79 | 2200 | 1.6606          | 0.8404 / 0.9021 / 0.8701 | 0.6509 / 0.5798 / 0.6133 | 0.8885 / 0.9025 / 0.8954 | 0.8560            | 0.8833         | 0.8694     | 0.7906           |
| 0.0002        | 126.32 | 2400 | 1.6619          | 0.8379 / 0.9045 / 0.8699 | 0.6837 / 0.5630 / 0.6175 | 0.8820 / 0.9090 / 0.8953 | 0.8541            | 0.8867         | 0.8701     | 0.7929           |
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.30.2 |
|
- Pytorch 2.1.0.dev20230523+cu117 |
|
- Datasets 2.13.0 |
|
- Tokenizers 0.13.3 |
|
|