---
license: cc-by-nc-sa-4.0
tags:
- generated_from_trainer
model-index:
- name: all_final_layoutlmv3-base-ner
  results: []
---


# all_final_layoutlmv3-base-ner

This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4992
- Footer: {'precision': 0.9660942316160281, 'recall': 0.962280701754386, 'f1': 0.9641836958910129, 'number': 2280}
- Header: {'precision': 0.8500527983104541, 'recall': 0.8464773922187171, 'f1': 0.8482613277133825, 'number': 951}
- Table: {'precision': 0.6867305061559508, 'recall': 0.820932134096484, 'f1': 0.7478584729981377, 'number': 1223}
- Caption: {'precision': 0.8540609137055838, 'recall': 0.8157575757575758, 'f1': 0.8344699318040918, 'number': 825}
- Text: {'precision': 0.7446111869031378, 'recall': 0.7724313614491933, 'f1': 0.7582661850514032, 'number': 3533}
- Picture: {'precision': 0.5221238938053098, 'recall': 0.5822368421052632, 'f1': 0.5505443234836703, 'number': 608}
- Title: {'precision': 0.6068376068376068, 'recall': 0.5966386554621849, 'f1': 0.6016949152542374, 'number': 119}
- Footnote: {'precision': 0.8503401360544217, 'recall': 0.8620689655172413, 'f1': 0.8561643835616437, 'number': 145}
- Formula: {'precision': 0.8461538461538461, 'recall': 0.9472222222222222, 'f1': 0.8938401048492791, 'number': 360}
- Overall Precision: 0.7918
- Overall Recall: 0.8260
- Overall F1: 0.8085
- Overall Accuracy: 0.9414
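
The overall figures are the micro-averaged seqeval metrics and can be cross-checked from the per-class numbers above: each class's true-positive count is recall × support, and its predicted count is true positives ÷ precision. A minimal sketch (values copied from the list above):

```python
# Cross-check the overall micro-averaged metrics from the
# per-class precision / recall / support reported above.
per_class = {
    # name: (precision, recall, support)
    "Footer":   (0.9660942316160281, 0.962280701754386,  2280),
    "Header":   (0.8500527983104541, 0.8464773922187171,  951),
    "Table":    (0.6867305061559508, 0.820932134096484,  1223),
    "Caption":  (0.8540609137055838, 0.8157575757575758,  825),
    "Text":     (0.7446111869031378, 0.7724313614491933, 3533),
    "Picture":  (0.5221238938053098, 0.5822368421052632,  608),
    "Title":    (0.6068376068376068, 0.5966386554621849,  119),
    "Footnote": (0.8503401360544217, 0.8620689655172413,  145),
    "Formula":  (0.8461538461538461, 0.9472222222222222,  360),
}

# Per-class true positives and predicted-entity counts are integers,
# so round to undo floating-point noise.
tp = sum(round(r * n) for p, r, n in per_class.values())
predicted = sum(round(round(r * n) / p) for p, r, n in per_class.values())
support = sum(n for _, _, n in per_class.values())

precision = tp / predicted                          # ≈ 0.7918
recall = tp / support                               # ≈ 0.8260
f1 = 2 * precision * recall / (precision + recall)  # ≈ 0.8085
```

These reproduce the reported Overall Precision, Recall, and F1 to four decimal places.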

## Model description

This model is [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) fine-tuned for token classification (NER) over document-layout regions. It tags tokens with nine entity classes: Footer, Header, Table, Caption, Text, Picture, Title, Footnote, and Formula.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
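
As a sanity check on training scale: the cumulative step counts in the results table below advance by 6,851 steps per epoch, so with `train_batch_size: 8` (and assuming no gradient accumulation, which the card does not mention) the training set must hold between 54,801 and 54,808 examples:

```python
import math

steps_per_epoch = 68510 // 10  # 6851; from the results table (10 epochs, 68510 steps)
batch_size = 8                 # train_batch_size above

# ceil(n / batch_size) == steps_per_epoch bounds the training-set size n.
n_max = steps_per_epoch * batch_size            # 54808
n_min = (steps_per_epoch - 1) * batch_size + 1  # 54801
assert math.ceil(n_max / batch_size) == steps_per_epoch
assert math.ceil(n_min / batch_size) == steps_per_epoch
```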

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Footer | Header | Table | Caption | Text | Picture | Title | Footnote | Formula | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:-----:|:-------:|:----:|:-------:|:-----:|:--------:|:-------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.1878        | 1.0   | 6851  | 0.3842          | {'precision': 0.9633318243549117, 'recall': 0.9333333333333333, 'f1': 0.948095344174649, 'number': 2280}  | {'precision': 0.7953296703296703, 'recall': 0.6088328075709779, 'f1': 0.6896962477665276, 'number': 951} | {'precision': 0.4947542794036444, 'recall': 0.7326246933769419, 'f1': 0.5906394199077125, 'number': 1223} | {'precision': 0.8165517241379311, 'recall': 0.7175757575757575, 'f1': 0.7638709677419355, 'number': 825} | {'precision': 0.6012542461458061, 'recall': 0.6512878573450326, 'f1': 0.6252717391304349, 'number': 3533} | {'precision': 0.4029363784665579, 'recall': 0.40625, 'f1': 0.40458640458640466, 'number': 608}           | {'precision': 0.7555555555555555, 'recall': 0.2857142857142857, 'f1': 0.41463414634146345, 'number': 119} | {'precision': 0.5217391304347826, 'recall': 0.41379310344827586, 'f1': 0.4615384615384615, 'number': 145} | {'precision': 0.6575963718820862, 'recall': 0.8055555555555556, 'f1': 0.7240948813982523, 'number': 360} | 0.6779            | 0.7096         | 0.6934     | 0.9274           |
| 0.0984        | 2.0   | 13702 | 0.2675          | {'precision': 0.9493287137288869, 'recall': 0.9614035087719298, 'f1': 0.9553279581608194, 'number': 2280} | {'precision': 0.8330058939096268, 'recall': 0.8916929547844374, 'f1': 0.8613509395632302, 'number': 951} | {'precision': 0.5959723096286973, 'recall': 0.7743254292722813, 'f1': 0.6735419630156473, 'number': 1223} | {'precision': 0.8536585365853658, 'recall': 0.7636363636363637, 'f1': 0.8061420345489444, 'number': 825} | {'precision': 0.6965589155370178, 'recall': 0.7562977639399944, 'f1': 0.725200162844348, 'number': 3533}  | {'precision': 0.4162371134020619, 'recall': 0.53125, 'f1': 0.4667630057803468, 'number': 608}            | {'precision': 0.8028169014084507, 'recall': 0.4789915966386555, 'f1': 0.6000000000000001, 'number': 119}  | {'precision': 0.8055555555555556, 'recall': 0.8, 'f1': 0.8027681660899654, 'number': 145}                 | {'precision': 0.8200514138817481, 'recall': 0.8861111111111111, 'f1': 0.8518024032042723, 'number': 360} | 0.7455            | 0.8068         | 0.7750     | 0.9415           |
| 0.0734        | 3.0   | 20553 | 0.3427          | {'precision': 0.9673289183222958, 'recall': 0.9609649122807018, 'f1': 0.9641364136413642, 'number': 2280} | {'precision': 0.8236434108527132, 'recall': 0.8937960042060988, 'f1': 0.8572869389813415, 'number': 951} | {'precision': 0.6429503916449086, 'recall': 0.8053965658217498, 'f1': 0.7150635208711434, 'number': 1223} | {'precision': 0.8019441069258809, 'recall': 0.8, 'f1': 0.8009708737864076, 'number': 825}                | {'precision': 0.7178800856531049, 'recall': 0.7591282196433626, 'f1': 0.7379281881964507, 'number': 3533} | {'precision': 0.4699853587115666, 'recall': 0.5279605263157895, 'f1': 0.4972889233152594, 'number': 608} | {'precision': 0.6941176470588235, 'recall': 0.4957983193277311, 'f1': 0.5784313725490197, 'number': 119}  | {'precision': 0.8467153284671532, 'recall': 0.8, 'f1': 0.8226950354609929, 'number': 145}                 | {'precision': 0.7995110024449877, 'recall': 0.9083333333333333, 'f1': 0.8504551365409623, 'number': 360} | 0.7654            | 0.8155         | 0.7896     | 0.9398           |
| 0.0562        | 4.0   | 27404 | 0.3282          | {'precision': 0.9607493309545049, 'recall': 0.9447368421052632, 'f1': 0.9526758071649712, 'number': 2280} | {'precision': 0.8263598326359832, 'recall': 0.8307045215562566, 'f1': 0.8285264813843733, 'number': 951} | {'precision': 0.6615074024226111, 'recall': 0.803761242845462, 'f1': 0.7257290513104466, 'number': 1223}  | {'precision': 0.7624861265260822, 'recall': 0.8327272727272728, 'f1': 0.7960602549246814, 'number': 825} | {'precision': 0.7137466307277628, 'recall': 0.7495046702519106, 'f1': 0.7311887339500207, 'number': 3533} | {'precision': 0.4671875, 'recall': 0.4917763157894737, 'f1': 0.47916666666666663, 'number': 608}         | {'precision': 0.5688073394495413, 'recall': 0.5210084033613446, 'f1': 0.543859649122807, 'number': 119}   | {'precision': 0.8907563025210085, 'recall': 0.7310344827586207, 'f1': 0.8030303030303031, 'number': 145}  | {'precision': 0.8196286472148541, 'recall': 0.8583333333333333, 'f1': 0.8385345997286294, 'number': 360} | 0.7626            | 0.8003         | 0.7810     | 0.9399           |
| 0.0413        | 5.0   | 34255 | 0.3534          | {'precision': 0.9620978404583517, 'recall': 0.9574561403508772, 'f1': 0.9597713783249064, 'number': 2280} | {'precision': 0.8567041965199591, 'recall': 0.8801261829652997, 'f1': 0.8682572614107884, 'number': 951} | {'precision': 0.6488095238095238, 'recall': 0.8021259198691741, 'f1': 0.7173674588665446, 'number': 1223} | {'precision': 0.8273736128236745, 'recall': 0.8133333333333334, 'f1': 0.8202933985330073, 'number': 825} | {'precision': 0.7264, 'recall': 0.7710161335975092, 'f1': 0.7480433887134423, 'number': 3533}             | {'precision': 0.4946401225114854, 'recall': 0.53125, 'f1': 0.5122918318794607, 'number': 608}            | {'precision': 0.5426356589147286, 'recall': 0.5882352941176471, 'f1': 0.5645161290322581, 'number': 119}  | {'precision': 0.7677419354838709, 'recall': 0.8206896551724138, 'f1': 0.7933333333333332, 'number': 145}  | {'precision': 0.8656330749354005, 'recall': 0.9305555555555556, 'f1': 0.896921017402945, 'number': 360}  | 0.7745            | 0.8207         | 0.7969     | 0.9442           |
| 0.0304        | 6.0   | 41106 | 0.4328          | {'precision': 0.9694960212201591, 'recall': 0.9618421052631579, 'f1': 0.965653896961691, 'number': 2280}  | {'precision': 0.8734939759036144, 'recall': 0.9148264984227129, 'f1': 0.8936825885978429, 'number': 951} | {'precision': 0.6484018264840182, 'recall': 0.812755519215045, 'f1': 0.7213352685050799, 'number': 1223}  | {'precision': 0.8534370946822308, 'recall': 0.7975757575757576, 'f1': 0.8245614035087719, 'number': 825} | {'precision': 0.7543478260869565, 'recall': 0.785734503255024, 'f1': 0.7697213364758074, 'number': 3533}  | {'precision': 0.48439821693907875, 'recall': 0.5361842105263158, 'f1': 0.508977361436378, 'number': 608} | {'precision': 0.6111111111111112, 'recall': 0.5546218487394958, 'f1': 0.5814977973568282, 'number': 119}  | {'precision': 0.8872180451127819, 'recall': 0.8137931034482758, 'f1': 0.8489208633093526, 'number': 145}  | {'precision': 0.8829787234042553, 'recall': 0.9222222222222223, 'f1': 0.9021739130434783, 'number': 360} | 0.7912            | 0.8296         | 0.8100     | 0.9388           |
| 0.0199        | 7.0   | 47957 | 0.3676          | {'precision': 0.9600347523892268, 'recall': 0.9692982456140351, 'f1': 0.9646442601484069, 'number': 2280} | {'precision': 0.8231441048034934, 'recall': 0.7928496319663512, 'f1': 0.8077129084092126, 'number': 951} | {'precision': 0.6872852233676976, 'recall': 0.8176614881439084, 'f1': 0.7468259895444361, 'number': 1223} | {'precision': 0.7914252607184241, 'recall': 0.8278787878787879, 'f1': 0.8092417061611374, 'number': 825} | {'precision': 0.7416034669555797, 'recall': 0.7749787715822247, 'f1': 0.757923875432526, 'number': 3533}  | {'precision': 0.4984375, 'recall': 0.524671052631579, 'f1': 0.5112179487179488, 'number': 608}           | {'precision': 0.7777777777777778, 'recall': 0.5882352941176471, 'f1': 0.6698564593301436, 'number': 119}  | {'precision': 0.8823529411764706, 'recall': 0.8275862068965517, 'f1': 0.8540925266903915, 'number': 145}  | {'precision': 0.8871391076115486, 'recall': 0.9388888888888889, 'f1': 0.9122807017543859, 'number': 360} | 0.7859            | 0.8196         | 0.8024     | 0.9438           |
| 0.0132        | 8.0   | 54808 | 0.4376          | {'precision': 0.96, 'recall': 0.9578947368421052, 'f1': 0.9589462129527992, 'number': 2280}               | {'precision': 0.8481404958677686, 'recall': 0.8633017875920084, 'f1': 0.8556539864512768, 'number': 951} | {'precision': 0.6822880771881461, 'recall': 0.8094848732624693, 'f1': 0.7404637247569185, 'number': 1223} | {'precision': 0.8098086124401914, 'recall': 0.8206060606060606, 'f1': 0.8151715833835039, 'number': 825} | {'precision': 0.7253596164091636, 'recall': 0.7707330880271723, 'f1': 0.7473583093179635, 'number': 3533} | {'precision': 0.5070422535211268, 'recall': 0.5328947368421053, 'f1': 0.5196471531676022, 'number': 608} | {'precision': 0.6194690265486725, 'recall': 0.5882352941176471, 'f1': 0.603448275862069, 'number': 119}   | {'precision': 0.8145695364238411, 'recall': 0.8482758620689655, 'f1': 0.8310810810810811, 'number': 145}  | {'precision': 0.8596491228070176, 'recall': 0.9527777777777777, 'f1': 0.9038208168642952, 'number': 360} | 0.7798            | 0.8219         | 0.8003     | 0.9414           |
| 0.0084        | 9.0   | 61659 | 0.4624          | {'precision': 0.9685562444641276, 'recall': 0.9592105263157895, 'f1': 0.9638607315998237, 'number': 2280} | {'precision': 0.8442714126807565, 'recall': 0.7981072555205048, 'f1': 0.8205405405405405, 'number': 951} | {'precision': 0.6707152496626181, 'recall': 0.812755519215045, 'f1': 0.7349353049907579, 'number': 1223}  | {'precision': 0.8192771084337349, 'recall': 0.8242424242424242, 'f1': 0.8217522658610271, 'number': 825} | {'precision': 0.7267348036578806, 'recall': 0.764789131050099, 'f1': 0.7452765135843331, 'number': 3533}  | {'precision': 0.5121212121212121, 'recall': 0.555921052631579, 'f1': 0.5331230283911672, 'number': 608}  | {'precision': 0.7070707070707071, 'recall': 0.5882352941176471, 'f1': 0.6422018348623852, 'number': 119}  | {'precision': 0.8541666666666666, 'recall': 0.8482758620689655, 'f1': 0.8512110726643598, 'number': 145}  | {'precision': 0.868020304568528, 'recall': 0.95, 'f1': 0.9071618037135278, 'number': 360}                | 0.7817            | 0.8159         | 0.7984     | 0.9408           |
| 0.0053        | 10.0  | 68510 | 0.4992          | {'precision': 0.9660942316160281, 'recall': 0.962280701754386, 'f1': 0.9641836958910129, 'number': 2280}  | {'precision': 0.8500527983104541, 'recall': 0.8464773922187171, 'f1': 0.8482613277133825, 'number': 951} | {'precision': 0.6867305061559508, 'recall': 0.820932134096484, 'f1': 0.7478584729981377, 'number': 1223}  | {'precision': 0.8540609137055838, 'recall': 0.8157575757575758, 'f1': 0.8344699318040918, 'number': 825} | {'precision': 0.7446111869031378, 'recall': 0.7724313614491933, 'f1': 0.7582661850514032, 'number': 3533} | {'precision': 0.5221238938053098, 'recall': 0.5822368421052632, 'f1': 0.5505443234836703, 'number': 608} | {'precision': 0.6068376068376068, 'recall': 0.5966386554621849, 'f1': 0.6016949152542374, 'number': 119}  | {'precision': 0.8503401360544217, 'recall': 0.8620689655172413, 'f1': 0.8561643835616437, 'number': 145}  | {'precision': 0.8461538461538461, 'recall': 0.9472222222222222, 'f1': 0.8938401048492791, 'number': 360} | 0.7918            | 0.8260         | 0.8085     | 0.9414           |


### Framework versions

- Transformers 4.26.0
- Pytorch 1.12.1
- Datasets 2.9.0
- Tokenizers 0.13.2