---
license: cc-by-nc-sa-4.0
tags:
- generated_from_trainer
model-index:
- name: layoutlmv3-base-ner
  results: []
---


# layoutlmv3-base-ner

This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5413
- Overall Precision: 0.3939
- Overall Recall: 0.4936
- Overall F1: 0.4382
- Overall Accuracy: 0.7180

Per-class results:

| Label    | Precision | Recall | F1     | Support |
|:---------|----------:|-------:|-------:|--------:|
| Footer   |    0.9382 | 0.8654 | 0.9003 |    2280 |
| Header   |    0.6690 | 0.6015 | 0.6334 |     951 |
| Table    |    0.1995 | 0.5143 | 0.2875 |    1223 |
| Caption  |    0.3212 | 0.0752 | 0.1218 |     825 |
| Text     |    0.3408 | 0.4648 | 0.3932 |    3533 |
| Picture  |    0.0546 | 0.1316 | 0.0772 |     608 |
| Title    |    0.0    | 0.0    | 0.0    |     119 |
| Footnote |    0.0    | 0.0    | 0.0    |     145 |
| Formula  |    0.0    | 0.0    | 0.0    |     360 |
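
The per-class breakdown has the shape produced by `seqeval`, the library the stock Transformers token-classification scripts use for entity-level scoring; that this card's metrics came from it is an assumption. A minimal sketch:

```python
# Minimal sketch, assuming entity-level metrics were computed with `seqeval`
# (the library used by the standard Transformers token-classification examples).
from seqeval.metrics import classification_report

# Toy BIO-tagged sequences: gold labels vs. model predictions.
y_true = [["B-Footer", "I-Footer", "O", "B-Table", "I-Table"]]
y_pred = [["B-Footer", "I-Footer", "O", "B-Table", "O"]]

# Prints precision/recall/F1/support per entity type (Footer, Table, ...).
print(classification_report(y_true, y_pred))
```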

## Model description

Based on the label set and metrics above, this checkpoint fine-tunes LayoutLMv3-base for token classification over nine document-layout entity types: Footer, Header, Table, Caption, Text, Picture, Title, Footnote, and Formula. LayoutLMv3 combines text, layout (bounding-box), and image embeddings, so inputs consist of OCR words, their boxes, and the page image.
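
A minimal inference sketch, assuming the standard LayoutLMv3 processing pipeline; the repo id, image path, and OCR words/boxes below are placeholders, and boxes must be normalized to a 0-1000 coordinate scale:

```python
import torch
from PIL import Image
from transformers import AutoProcessor, LayoutLMv3ForTokenClassification

# Placeholder ids/paths: substitute your own checkpoint directory and page image.
processor = AutoProcessor.from_pretrained("microsoft/layoutlmv3-base", apply_ocr=False)
model = LayoutLMv3ForTokenClassification.from_pretrained("layoutlmv3-base-ner")
model.eval()

image = Image.open("page.png").convert("RGB")
words = ["Figure", "1:", "Example", "caption"]        # OCR tokens (hypothetical)
boxes = [[110, 500, 160, 515], [165, 500, 180, 515],  # one box per word,
         [185, 500, 240, 515], [245, 500, 300, 515]]  # normalized to 0-1000

encoding = processor(image, words, boxes=boxes, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**encoding).logits                 # (1, seq_len, num_labels)

pred_ids = logits.argmax(-1).squeeze(0).tolist()
print([model.config.id2label[i] for i in pred_ids])   # per-subword predictions
```

Predictions come back per subword token; to recover word-level labels, map tokens back to words with `encoding.word_ids()` and keep the first token of each word.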

## Intended uses & limitations

The model is intended for tagging document tokens with layout entity types (e.g., separating body text from headers, footers, and tables). Its limitations are visible in the evaluation above: Footer and Header are detected reliably, but Title, Footnote, and Formula are never predicted correctly (F1 0.0), and overall F1 is only 0.4382, so the checkpoint should be treated as a baseline rather than a production model. Note also that the cc-by-nc-sa-4.0 license limits use to non-commercial settings.

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):
- learning_rate: 3e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
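
A hedged reconstruction of this configuration using the Hugging Face `TrainingArguments` API; the output directory is a placeholder, and any option not listed in this card is left at its Trainer default:

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above. Options this card
# does not mention (warmup, weight decay, fp16, ...) stay at their defaults.
training_args = TrainingArguments(
    output_dir="layoutlmv3-base-ner",   # placeholder output path
    learning_rate=3e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    adam_beta1=0.9,                     # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```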

### Training results

Per-class cells give precision / recall / F1; supports match the evaluation table above.

| Training Loss | Epoch | Step | Validation Loss | Footer | Header | Table | Caption | Text | Picture | Title | Footnote | Formula | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.4772 | 1.0 | 500  | 1.3891 | 0.8113 / 0.7202 / 0.7630 | 0.0 / 0.0 / 0.0 | 0.1337 / 0.3279 / 0.1899 | 0.0 / 0.0 / 0.0 | 0.2246 / 0.2751 / 0.2473 | 0.0222 / 0.0082 / 0.0120 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.3131 | 0.3007 | 0.3068 | 0.7042 |
| 0.2598 | 2.0 | 1000 | 1.3046 | 0.6727 / 0.8382 / 0.7463 | 0.2866 / 0.5626 / 0.3797 | 0.1204 / 0.4734 / 0.1920 | 0.0 / 0.0 / 0.0 | 0.2262 / 0.4404 / 0.2989 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.2682 | 0.4561 | 0.3378 | 0.6730 |
| 0.1936 | 3.0 | 1500 | 1.4208 | 0.9038 / 0.8531 / 0.8777 | 0.6213 / 0.5142 / 0.5627 | 0.1649 / 0.4857 / 0.2462 | 0.0 / 0.0 / 0.0 | 0.2464 / 0.3810 / 0.2992 | 0.0272 / 0.0444 / 0.0338 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.3385 | 0.4382 | 0.3819 | 0.7043 |
| 0.1392 | 4.0 | 2000 | 1.7208 | 0.9364 / 0.8583 / 0.8957 | 0.6707 / 0.6488 / 0.6595 | 0.1570 / 0.5356 / 0.2428 | 0.1899 / 0.0594 / 0.0905 | 0.2669 / 0.4314 / 0.3297 | 0.0467 / 0.0757 / 0.0578 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.3430 | 0.4827 | 0.4010 | 0.6703 |
| 0.0899 | 5.0 | 2500 | 1.5413 | 0.9382 / 0.8654 / 0.9003 | 0.6690 / 0.6015 / 0.6334 | 0.1995 / 0.5143 / 0.2875 | 0.3212 / 0.0752 / 0.1218 | 0.3408 / 0.4648 / 0.3932 | 0.0546 / 0.1316 / 0.0772 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.3939 | 0.4936 | 0.4382 | 0.7180 |


### Framework versions

- Transformers 4.26.0
- Pytorch 1.12.1
- Datasets 2.9.0
- Tokenizers 0.13.2