End of training
README.md
CHANGED
```diff
@@ -2,6 +2,8 @@
 license: mit
 tags:
 - generated_from_trainer
+datasets:
+- funsd-layoutlmv3
 model-index:
 - name: lilt-en-funsd
   results: []
@@ -12,7 +14,16 @@ should probably proofread and complete it, then remove this comment. -->
 
 # lilt-en-funsd
 
-This model is a fine-tuned version of [SCUT-DLVCLab/lilt-roberta-en-base](https://huggingface.co/SCUT-DLVCLab/lilt-roberta-en-base) on the
+This model is a fine-tuned version of [SCUT-DLVCLab/lilt-roberta-en-base](https://huggingface.co/SCUT-DLVCLab/lilt-roberta-en-base) on the funsd-layoutlmv3 dataset.
+It achieves the following results on the evaluation set:
+- Loss: 1.5275
+- Answer: {'precision': 0.8656542056074766, 'recall': 0.9069767441860465, 'f1': 0.885833831440526, 'number': 817}
+- Header: {'precision': 0.6050420168067226, 'recall': 0.6050420168067226, 'f1': 0.6050420168067226, 'number': 119}
+- Question: {'precision': 0.8847184986595175, 'recall': 0.9192200557103064, 'f1': 0.9016393442622952, 'number': 1077}
+- Overall Precision: 0.8610
+- Overall Recall: 0.8957
+- Overall F1: 0.8780
+- Overall Accuracy: 0.8063
 
 ## Model description
 
@@ -32,12 +43,30 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 5e-05
-- train_batch_size:
-- eval_batch_size:
+- train_batch_size: 8
+- eval_batch_size: 8
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- training_steps:
+- training_steps: 2500
+
+### Training results
+
+| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:--------:|:-----------------:|:--------------:|:----------:|:----------------:|
+| 0.4351 | 10.53 | 200 | 1.0793 | {'precision': 0.8371824480369515, 'recall': 0.8873929008567931, 'f1': 0.8615567439096851, 'number': 817} | {'precision': 0.5833333333333334, 'recall': 0.47058823529411764, 'f1': 0.5209302325581395, 'number': 119} | {'precision': 0.8660071942446043, 'recall': 0.8941504178272981, 'f1': 0.8798538145271813, 'number': 1077} | 0.8409 | 0.8664 | 0.8534 | 0.7890 |
+| 0.0438 | 21.05 | 400 | 1.4722 | {'precision': 0.8228822882288229, 'recall': 0.9155446756425949, 'f1': 0.8667439165701043, 'number': 817} | {'precision': 0.5916666666666667, 'recall': 0.5966386554621849, 'f1': 0.5941422594142259, 'number': 119} | {'precision': 0.9014218009478673, 'recall': 0.883008356545961, 'f1': 0.8921200750469044, 'number': 1077} | 0.8493 | 0.8793 | 0.8640 | 0.7809 |
+| 0.0138 | 31.58 | 600 | 1.7220 | {'precision': 0.8554641598119859, 'recall': 0.8910648714810282, 'f1': 0.8729016786570742, 'number': 817} | {'precision': 0.5138888888888888, 'recall': 0.6218487394957983, 'f1': 0.5627376425855513, 'number': 119} | {'precision': 0.8821396192203083, 'recall': 0.903435468895079, 'f1': 0.8926605504587157, 'number': 1077} | 0.8460 | 0.8818 | 0.8635 | 0.7762 |
+| 0.0085 | 42.11 | 800 | 1.5618 | {'precision': 0.8459796149490374, 'recall': 0.9143206854345165, 'f1': 0.8788235294117647, 'number': 817} | {'precision': 0.6078431372549019, 'recall': 0.5210084033613446, 'f1': 0.5610859728506787, 'number': 119} | {'precision': 0.8905109489051095, 'recall': 0.9062209842154132, 'f1': 0.8982972848596411, 'number': 1077} | 0.8578 | 0.8867 | 0.8720 | 0.7965 |
+| 0.004 | 52.63 | 1000 | 1.5275 | {'precision': 0.8656542056074766, 'recall': 0.9069767441860465, 'f1': 0.885833831440526, 'number': 817} | {'precision': 0.6050420168067226, 'recall': 0.6050420168067226, 'f1': 0.6050420168067226, 'number': 119} | {'precision': 0.8847184986595175, 'recall': 0.9192200557103064, 'f1': 0.9016393442622952, 'number': 1077} | 0.8610 | 0.8957 | 0.8780 | 0.8063 |
+| 0.0029 | 63.16 | 1200 | 1.6222 | {'precision': 0.8576349024110218, 'recall': 0.9143206854345165, 'f1': 0.8850710900473934, 'number': 817} | {'precision': 0.6565656565656566, 'recall': 0.5462184873949579, 'f1': 0.5963302752293578, 'number': 119} | {'precision': 0.8782452999104745, 'recall': 0.9108635097493036, 'f1': 0.8942570647219691, 'number': 1077} | 0.8591 | 0.8907 | 0.8746 | 0.7879 |
+| 0.0018 | 73.68 | 1400 | 1.7916 | {'precision': 0.8514285714285714, 'recall': 0.9118727050183598, 'f1': 0.8806146572104018, 'number': 817} | {'precision': 0.5390625, 'recall': 0.5798319327731093, 'f1': 0.5587044534412955, 'number': 119} | {'precision': 0.8845096241979835, 'recall': 0.8960074280408542, 'f1': 0.890221402214022, 'number': 1077} | 0.8496 | 0.8838 | 0.8663 | 0.7738 |
+| 0.0013 | 84.21 | 1600 | 1.9358 | {'precision': 0.8530751708428246, 'recall': 0.9167686658506732, 'f1': 0.8837758112094395, 'number': 817} | {'precision': 0.6263736263736264, 'recall': 0.4789915966386555, 'f1': 0.5428571428571428, 'number': 119} | {'precision': 0.8936363636363637, 'recall': 0.9127205199628597, 'f1': 0.9030776297657327, 'number': 1077} | 0.8647 | 0.8887 | 0.8765 | 0.7762 |
+| 0.0006 | 94.74 | 1800 | 1.9653 | {'precision': 0.8605990783410138, 'recall': 0.9143206854345165, 'f1': 0.886646884272997, 'number': 817} | {'precision': 0.5818181818181818, 'recall': 0.5378151260504201, 'f1': 0.5589519650655022, 'number': 119} | {'precision': 0.8934802571166207, 'recall': 0.903435468895079, 'f1': 0.8984302862419206, 'number': 1077} | 0.8631 | 0.8862 | 0.8745 | 0.7717 |
+| 0.0005 | 105.26 | 2000 | 1.9637 | {'precision': 0.8527397260273972, 'recall': 0.9143206854345165, 'f1': 0.8824571766095688, 'number': 817} | {'precision': 0.5619047619047619, 'recall': 0.4957983193277311, 'f1': 0.5267857142857143, 'number': 119} | {'precision': 0.8894927536231884, 'recall': 0.9117920148560817, 'f1': 0.9005043558000918, 'number': 1077} | 0.8576 | 0.8882 | 0.8726 | 0.7686 |
+| 0.0003 | 115.79 | 2200 | 1.8611 | {'precision': 0.8609501738122828, 'recall': 0.9094247246022031, 'f1': 0.8845238095238096, 'number': 817} | {'precision': 0.5929203539823009, 'recall': 0.5630252100840336, 'f1': 0.5775862068965517, 'number': 119} | {'precision': 0.8886861313868614, 'recall': 0.904363974001857, 'f1': 0.8964565117349288, 'number': 1077} | 0.8610 | 0.8862 | 0.8734 | 0.7839 |
+| 0.0002 | 126.32 | 2400 | 1.9031 | {'precision': 0.860919540229885, 'recall': 0.9167686658506732, 'f1': 0.887966804979253, 'number': 817} | {'precision': 0.5887850467289719, 'recall': 0.5294117647058824, 'f1': 0.5575221238938053, 'number': 119} | {'precision': 0.8902104300091491, 'recall': 0.903435468895079, 'f1': 0.8967741935483872, 'number': 1077} | 0.8623 | 0.8867 | 0.8744 | 0.7805 |
+
 
 ### Framework versions
 
```
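The hyperparameters added above correspond to the standard `transformers` `Trainer` setup that `generated_from_trainer` cards are produced from. Below is a minimal sketch of that setup under stated assumptions: the `train_ds`/`eval_ds` names stand in for preprocessed funsd-layoutlmv3 splits (preparation not shown), `num_labels=7` assumes the usual FUNSD BIO label set, and the 200-step eval interval is read off the results table.

```python
# Sketch only: reconstructs the listed hyperparameters with the Trainer API.
# train_ds / eval_ds are placeholders for tokenized funsd-layoutlmv3 splits.
from transformers import AutoModelForTokenClassification, Trainer, TrainingArguments

model = AutoModelForTokenClassification.from_pretrained(
    "SCUT-DLVCLab/lilt-roberta-en-base",
    num_labels=7,  # O, B/I-HEADER, B/I-QUESTION, B/I-ANSWER (assumed label set)
)

args = TrainingArguments(
    output_dir="lilt-en-funsd",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    max_steps=2500,              # training_steps: 2500
    lr_scheduler_type="linear",
    evaluation_strategy="steps",
    eval_steps=200,              # the results table reports validation every 200 steps
)

# trainer = Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```

The reported Adam betas (0.9, 0.999) and epsilon 1e-08 match the `TrainingArguments` defaults, so they need no explicit arguments in this sketch.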
logs/events.out.tfevents.1687178788.74e720cf39c8.1604.0
CHANGED
```diff
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:f45d80cbf68e15b2dd07af3f555e09c0329815d996a51b996b61b7199269ae06
+size 12694
```
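These files are tracked with Git LFS, so the diffs show only the pointer fields (spec version, `oid sha256:` digest, `size` in bytes) rather than file contents. A small sketch of how a locally downloaded file could be checked against such a pointer; the example path and call are assumptions, not part of this commit.

```python
# Verify a downloaded file against the "oid sha256:" and "size" fields of an LFS pointer.
import hashlib
import os

def matches_lfs_pointer(path: str, expected_sha256: str, expected_size: int) -> bool:
    """Return True if the file's byte size and SHA-256 digest match the pointer."""
    if os.path.getsize(path) != expected_size:
        return False
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256

# Hypothetical usage with the pointer values from the diff above:
# matches_lfs_pointer("logs/events.out.tfevents.1687178788.74e720cf39c8.1604.0",
#                     "f45d80cbf68e15b2dd07af3f555e09c0329815d996a51b996b61b7199269ae06",
#                     12694)
```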
logs/events.out.tfevents.1687180628.74e720cf39c8.1604.1
ADDED
```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:0bdc71c78c2000a552dacfee9d99b25ec7a4587169709617a9b2ec3dfa516bda
+size 592
```
pytorch_model.bin
CHANGED
```diff
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:75ee978fc70fc4513cffb53d047b1d748b7147af4b8690678688b1e801934a94
 size 520821201
```
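With the final weights pushed to `pytorch_model.bin`, the checkpoint can be loaded like any other token-classification model in `transformers`. A minimal usage sketch follows; the repository id is a placeholder for wherever this card lives on the Hub, and the example words and 0-1000-normalized bounding boxes are made up for illustration.

```python
# Sketch of running the fine-tuned LiLT checkpoint for token classification.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

repo_id = "<user>/lilt-en-funsd"  # placeholder: replace with the actual Hub repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # assumes the tokenizer was pushed alongside the model
model = AutoModelForTokenClassification.from_pretrained(repo_id)

words = ["Invoice", "Number:", "12345"]                                      # example input
boxes = [[74, 62, 142, 80], [150, 62, 230, 80], [238, 62, 310, 80]]          # made-up boxes, 0-1000 scale

enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# LiLT expects one bounding box per token, so map word-level boxes onto sub-word tokens.
word_ids = enc.word_ids()
bbox = [[0, 0, 0, 0] if i is None else boxes[i] for i in word_ids]
enc["bbox"] = torch.tensor([bbox])

with torch.no_grad():
    logits = model(**enc).logits
predictions = logits.argmax(-1).squeeze().tolist()
print([model.config.id2label[p] for p in predictions])
```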