---
license: mit
tags:
- generated_from_trainer
datasets:
- funsd-layoutlmv3
model-index:
- name: lilt-en-funsd
  results: []
---

# lilt-en-funsd

This model is a fine-tuned version of [SCUT-DLVCLab/lilt-roberta-en-base](https://huggingface.co/SCUT-DLVCLab/lilt-roberta-en-base) on the funsd-layoutlmv3 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.8784
- Overall Accuracy: 0.7948

| Label    | Precision | Recall | F1     | Support |
|:---------|:---------:|:------:|:------:|:-------:|
| Answer   | 0.8652    | 0.9033 | 0.8838 | 817     |
| Header   | 0.6505    | 0.5630 | 0.6036 | 119     |
| Question | 0.9073    | 0.9183 | 0.9128 | 1077    |
| Overall  | 0.8768    | 0.8912 | 0.8840 | 2013    |
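
These per-label rows follow the output format of `seqeval`, which the stock `transformers` token-classification evaluation loop typically relies on. Below is a minimal sketch of how such numbers are produced; the BIO tags are illustrative FUNSD-style labels, not taken from this training run:

```python
# Minimal sketch: per-entity precision/recall/F1 with seqeval.
# The BIO tags below are illustrative FUNSD-style labels.
from seqeval.metrics import classification_report, f1_score

# Gold and predicted tag sequences for two toy documents.
y_true = [
    ["B-QUESTION", "I-QUESTION", "B-ANSWER", "O"],
    ["B-HEADER", "I-HEADER", "O", "B-ANSWER"],
]
y_pred = [
    ["B-QUESTION", "I-QUESTION", "B-ANSWER", "O"],
    ["B-HEADER", "O", "O", "B-ANSWER"],
]

# Per-entity precision/recall/F1 and support, matching the
# ANSWER / HEADER / QUESTION rows in the table above.
print(classification_report(y_true, y_pred))
print("overall F1:", f1_score(y_true, y_pred))
```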

## Model description

LiLT (Language-Independent Layout Transformer, [arXiv:2202.13669](https://arxiv.org/abs/2202.13669)) pairs a pre-trained text encoder, here English RoBERTa-base, with a parallel layout flow that encodes the 2D bounding boxes of words on the page. This checkpoint fine-tunes the combined model for token classification on FUNSD, tagging each word of a scanned form as part of a question, answer, or header field.
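
A hedged inference sketch follows. The local checkpoint path `lilt-en-funsd`, the Hub dataset id `nielsr/funsd-layoutlmv3`, and the column names `tokens`/`bboxes` are assumptions, not confirmed by this card:

```python
# Hedged sketch: token classification with the fine-tuned checkpoint.
# Checkpoint path, dataset id, and column names are assumptions.
import torch
from datasets import load_dataset
from transformers import AutoModelForTokenClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("SCUT-DLVCLab/lilt-roberta-en-base")
model = AutoModelForTokenClassification.from_pretrained("lilt-en-funsd")  # local output dir

example = load_dataset("nielsr/funsd-layoutlmv3", split="test")[0]
words, boxes = example["tokens"], example["bboxes"]

# The base checkpoint ships a LayoutLMv3-style fast tokenizer that accepts
# word-level boxes and aligns them to subword tokens.
encoding = tokenizer(words, boxes=boxes, truncation=True, return_tensors="pt")

with torch.no_grad():
    logits = model(**encoding).logits

predicted = logits.argmax(-1).squeeze().tolist()
tokens = tokenizer.convert_ids_to_tokens(encoding["input_ids"][0])
for token, label_id in zip(tokens, predicted):
    print(token, model.config.id2label[label_id])
```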

## Intended uses & limitations

The model is intended for form understanding on scanned English documents: given a form's words and their bounding boxes (e.g. from an OCR engine), it labels each word as part of a question, answer, or header field. Note that the model does not perform OCR itself; words and boxes must be supplied. FUNSD is small (199 annotated forms), so results on layouts or domains unlike its noisy scanned forms may degrade, and the comparatively low header F1 (about 0.60) suggests caution where header detection matters.

## Training and evaluation data

The model was trained and evaluated on the funsd-layoutlmv3 dataset, a preprocessed version of [FUNSD](https://guillaumejaume.github.io/FUNSD/) (Form Understanding in Noisy Scanned Documents) that provides, per document image, the words, word-level bounding boxes, and BIO-style `ner_tags`. FUNSD comprises 199 fully annotated forms, split into 149 for training and 50 for testing. A sketch for loading and inspecting it follows.
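
The Hub dataset id used below is an assumption; the card itself only names `funsd-layoutlmv3`:

```python
# Sketch: load the dataset and inspect its label set.
from datasets import load_dataset

dataset = load_dataset("nielsr/funsd-layoutlmv3")  # assumed Hub id
print(dataset)  # train/test splits with tokens, bboxes, ner_tags, image

# ClassLabel names; expected along the lines of:
# ['O', 'B-HEADER', 'I-HEADER', 'B-QUESTION', 'I-QUESTION', 'B-ANSWER', 'I-ANSWER']
print(dataset["train"].features["ner_tags"].feature.names)
```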

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 2500
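
A hedged reconstruction of these settings as `transformers` `TrainingArguments`; the output directory and the 200-step evaluation cadence (read off the results table below) are assumptions:

```python
# Sketch: the hyperparameters above expressed as TrainingArguments.
# output_dir and the 200-step eval cadence are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="lilt-en-funsd",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=2500,                  # training_steps: 2500
    evaluation_strategy="steps",     # results below are logged every 200 steps
    eval_steps=200,
    # Trainer's Adam defaults already match betas=(0.9, 0.999), eps=1e-8.
)
```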

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Answer                                                                                                   | Header                                                                                                   | Question                                                                                                  | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.4369        | 10.53  | 200  | 0.9022          | {'precision': 0.8049065420560748, 'recall': 0.8433292533659731, 'f1': 0.8236700537955769, 'number': 817} | {'precision': 0.5317460317460317, 'recall': 0.5630252100840336, 'f1': 0.5469387755102041, 'number': 119} | {'precision': 0.8837420526793823, 'recall': 0.903435468895079, 'f1': 0.8934802571166208, 'number': 1077}  | 0.8301            | 0.8589         | 0.8442     | 0.7888           |
| 0.047         | 21.05  | 400  | 1.3222          | {'precision': 0.8382526564344747, 'recall': 0.8690330477356181, 'f1': 0.8533653846153846, 'number': 817} | {'precision': 0.5447761194029851, 'recall': 0.6134453781512605, 'f1': 0.5770750988142292, 'number': 119} | {'precision': 0.8667866786678667, 'recall': 0.8941504178272981, 'f1': 0.8802559414990858, 'number': 1077} | 0.8346            | 0.8674         | 0.8507     | 0.7837           |
| 0.015         | 31.58  | 600  | 1.4745          | {'precision': 0.8549528301886793, 'recall': 0.8873929008567931, 'f1': 0.8708708708708709, 'number': 817} | {'precision': 0.5867768595041323, 'recall': 0.5966386554621849, 'f1': 0.5916666666666667, 'number': 119} | {'precision': 0.8755635707844905, 'recall': 0.9015784586815228, 'f1': 0.888380603842635, 'number': 1077}  | 0.8503            | 0.8778         | 0.8638     | 0.7969           |
| 0.0051        | 42.11  | 800  | 1.5719          | {'precision': 0.8768472906403941, 'recall': 0.8714810281517748, 'f1': 0.8741559238796808, 'number': 817} | {'precision': 0.5736434108527132, 'recall': 0.6218487394957983, 'f1': 0.596774193548387, 'number': 119}  | {'precision': 0.8794326241134752, 'recall': 0.9210770659238626, 'f1': 0.8997732426303855, 'number': 1077} | 0.8594            | 0.8833         | 0.8711     | 0.7923           |
| 0.0041        | 52.63  | 1000 | 1.6771          | {'precision': 0.8352402745995423, 'recall': 0.8935128518971848, 'f1': 0.8633944411590775, 'number': 817} | {'precision': 0.6568627450980392, 'recall': 0.5630252100840336, 'f1': 0.6063348416289592, 'number': 119} | {'precision': 0.8865116279069768, 'recall': 0.8848653667595172, 'f1': 0.8856877323420075, 'number': 1077} | 0.8532            | 0.8693         | 0.8612     | 0.7877           |
| 0.0039        | 63.16  | 1200 | 1.6064          | {'precision': 0.8609112709832134, 'recall': 0.8788249694002448, 'f1': 0.8697758933979407, 'number': 817} | {'precision': 0.6106194690265486, 'recall': 0.5798319327731093, 'f1': 0.5948275862068966, 'number': 119} | {'precision': 0.8897777777777778, 'recall': 0.9294336118848654, 'f1': 0.9091734786557675, 'number': 1077} | 0.8629            | 0.8882         | 0.8754     | 0.8009           |
| 0.0019        | 73.68  | 1400 | 1.7674          | {'precision': 0.8533178114086146, 'recall': 0.8971848225214198, 'f1': 0.8747016706443913, 'number': 817} | {'precision': 0.5769230769230769, 'recall': 0.5042016806722689, 'f1': 0.5381165919282511, 'number': 119} | {'precision': 0.8842676311030742, 'recall': 0.9080779944289693, 'f1': 0.8960146587265231, 'number': 1077} | 0.8560            | 0.8798         | 0.8677     | 0.7981           |
| 0.0007        | 84.21  | 1600 | 1.8380          | {'precision': 0.8469387755102041, 'recall': 0.9143206854345165, 'f1': 0.8793407886992348, 'number': 817} | {'precision': 0.6017699115044248, 'recall': 0.5714285714285714, 'f1': 0.5862068965517241, 'number': 119} | {'precision': 0.8931159420289855, 'recall': 0.9155060352831941, 'f1': 0.9041723979825768, 'number': 1077} | 0.8580            | 0.8947         | 0.8760     | 0.7931           |
| 0.0007        | 94.74  | 1800 | 1.8108          | {'precision': 0.8600478468899522, 'recall': 0.8800489596083231, 'f1': 0.8699334543254689, 'number': 817} | {'precision': 0.6435643564356436, 'recall': 0.5462184873949579, 'f1': 0.5909090909090908, 'number': 119} | {'precision': 0.8722849695916595, 'recall': 0.9322191272051996, 'f1': 0.9012567324955117, 'number': 1077} | 0.8563            | 0.8882         | 0.8720     | 0.7887           |
| 0.0004        | 105.26 | 2000 | 1.9035          | {'precision': 0.8627906976744186, 'recall': 0.9082007343941249, 'f1': 0.8849135360763267, 'number': 817} | {'precision': 0.6285714285714286, 'recall': 0.5546218487394958, 'f1': 0.5892857142857143, 'number': 119} | {'precision': 0.8955495004541326, 'recall': 0.9155060352831941, 'f1': 0.9054178145087237, 'number': 1077} | 0.8683            | 0.8912         | 0.8796     | 0.7965           |
| 0.0002        | 115.79 | 2200 | 1.8784          | {'precision': 0.8651817116060961, 'recall': 0.9033047735618115, 'f1': 0.8838323353293414, 'number': 817} | {'precision': 0.6504854368932039, 'recall': 0.5630252100840336, 'f1': 0.6036036036036037, 'number': 119} | {'precision': 0.9073394495412844, 'recall': 0.9182915506035283, 'f1': 0.912782648823258, 'number': 1077}  | 0.8768            | 0.8912         | 0.8840     | 0.7948           |
| 0.0002        | 126.32 | 2400 | 1.9075          | {'precision': 0.8640093786635404, 'recall': 0.9020807833537332, 'f1': 0.8826347305389222, 'number': 817} | {'precision': 0.6296296296296297, 'recall': 0.5714285714285714, 'f1': 0.5991189427312775, 'number': 119} | {'precision': 0.9041970802919708, 'recall': 0.9201485608170845, 'f1': 0.9121030832949838, 'number': 1077} | 0.8731            | 0.8922         | 0.8826     | 0.7959           |


### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.0
- Tokenizers 0.13.3