---
license: cc-by-nc-sa-4.0
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: layoutlmv3-finetuned-Algo_22000Words
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# layoutlmv3-finetuned-Algo_22000Words

This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4487
- Precision: 0.8409
- Recall: 0.8268
- F1: 0.8338
- Accuracy: 0.8727

## Model description

This checkpoint fine-tunes [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base), a multimodal (text + layout + image) Transformer for Document AI, for what appears to be a token-classification task (precision, recall, F1, and accuracy are the standard token-classification metrics). No further details about the task or its label set are documented.

## Intended uses & limitations

Not documented by the author. Given the base model and the reported token-level metrics, the checkpoint is presumably intended for labeling words on document images; a usage sketch follows below. Note that the cc-by-nc-sa-4.0 license restricts the model to non-commercial use.
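
A minimal inference sketch, assuming the checkpoint exposes a standard `LayoutLMv3ForTokenClassification` head; the repo id, image path, and OCR words/boxes below are placeholders:

```python
# Hedged inference sketch: the repo id, page image, and OCR words/boxes
# are placeholders, not values from this card.
import torch
from PIL import Image
from transformers import AutoProcessor, LayoutLMv3ForTokenClassification

repo_id = "your-namespace/layoutlmv3-finetuned-Algo_22000Words"  # placeholder repo id
processor = AutoProcessor.from_pretrained(repo_id, apply_ocr=False)
model = LayoutLMv3ForTokenClassification.from_pretrained(repo_id)

image = Image.open("page.png").convert("RGB")  # a scanned document page
words = ["Invoice", "No.", "12345"]            # OCR output (example values)
boxes = [[70, 40, 180, 60], [190, 40, 230, 60], [240, 40, 320, 60]]  # 0-1000 normalized

encoding = processor(image, words, boxes=boxes, return_tensors="pt")
with torch.no_grad():
    logits = model(**encoding).logits          # shape: (1, seq_len, num_labels)

pred_ids = logits.argmax(-1).squeeze(0).tolist()
print([(tok, model.config.id2label[i]) for tok, i in zip(encoding.tokens(), pred_ids)])
```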

## Training and evaluation data

Not documented. The model name suggests an "Algo_22000Words" corpus of roughly 22,000 annotated words; the evaluation split behind the results above is likewise unspecified.
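
Whatever the source corpus, LayoutLMv3 token-classification data is typically encoded with the processor. A hedged sketch follows; the example's field names and values are placeholders, not the actual (undocumented) dataset schema:

```python
# Hedged sketch of encoding one annotated example for LayoutLMv3.
# All field names and values below are placeholders.
from PIL import Image
from transformers import AutoProcessor

processor = AutoProcessor.from_pretrained("microsoft/layoutlmv3-base", apply_ocr=False)

example = {
    "image_path": "page.png",                      # scanned page
    "words": ["Total", ":", "42.00"],              # OCR'd words
    "boxes": [[50, 500, 120, 520], [125, 500, 135, 520], [140, 500, 220, 520]],
    "word_labels": [1, 0, 2],                      # integer label ids per word
}

encoding = processor(
    Image.open(example["image_path"]).convert("RGB"),
    example["words"],
    boxes=example["boxes"],                        # boxes normalized to 0-1000
    word_labels=example["word_labels"],            # spread to sub-tokens; -100 on specials
    truncation=True,
    padding="max_length",
    return_tensors="pt",
)
```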

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 500
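
A hedged reconstruction of these settings with `transformers.TrainingArguments`; every value maps one-to-one onto the list above, and the evaluation cadence is inferred from the results table:

```python
# Hedged reconstruction of the reported hyperparameters.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlmv3-finetuned-Algo_22000Words",
    max_steps=500,                   # training_steps: 500
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                  # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,               # and epsilon=1e-08
    evaluation_strategy="steps",     # inferred: the table evaluates every 10 steps
    eval_steps=10,
)
```

These arguments would then be passed to `transformers.Trainer` together with the model, the (undocumented) train/eval datasets, and a `compute_metrics` function such as the seqeval sketch after the results table.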

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 0.03  | 10   | 1.8188          | 0.1983    | 0.1285 | 0.1559 | 0.3227   |
| No log        | 0.07  | 20   | 1.6701          | 0.1789    | 0.0950 | 0.1241 | 0.3364   |
| No log        | 0.10  | 30   | 1.5675          | 0.3028    | 0.1844 | 0.2292 | 0.4409   |
| No log        | 0.14  | 40   | 1.4618          | 0.3803    | 0.3017 | 0.3364 | 0.5182   |
| No log        | 0.17  | 50   | 1.3792          | 0.4228    | 0.3520 | 0.3841 | 0.5545   |
| No log        | 0.21  | 60   | 1.2919          | 0.4730    | 0.3911 | 0.4281 | 0.5727   |
| No log        | 0.24  | 70   | 1.2117          | 0.5513    | 0.4804 | 0.5134 | 0.6364   |
| No log        | 0.28  | 80   | 1.1297          | 0.6024    | 0.5587 | 0.5797 | 0.6909   |
| No log        | 0.31  | 90   | 1.0708          | 0.6176    | 0.5866 | 0.6017 | 0.6955   |
| No log        | 0.35  | 100  | 1.0096          | 0.6095    | 0.5754 | 0.5920 | 0.7091   |
| No log        | 0.38  | 110  | 0.9750          | 0.5818    | 0.5363 | 0.5581 | 0.6727   |
| No log        | 0.42  | 120  | 0.9478          | 0.5893    | 0.5531 | 0.5706 | 0.6818   |
| No log        | 0.45  | 130  | 0.8710          | 0.6494    | 0.6313 | 0.6402 | 0.7318   |
| No log        | 0.49  | 140  | 0.8509          | 0.6941    | 0.6592 | 0.6762 | 0.7636   |
| No log        | 0.52  | 150  | 0.8065          | 0.6959    | 0.6648 | 0.6800 | 0.7682   |
| No log        | 0.56  | 160  | 0.7702          | 0.7341    | 0.7095 | 0.7216 | 0.7909   |
| No log        | 0.59  | 170  | 0.7290          | 0.7529    | 0.7318 | 0.7422 | 0.8045   |
| No log        | 0.63  | 180  | 0.7143          | 0.7414    | 0.7207 | 0.7309 | 0.7955   |
| No log        | 0.66  | 190  | 0.7161          | 0.7557    | 0.7430 | 0.7493 | 0.8091   |
| No log        | 0.70  | 200  | 0.6983          | 0.7443    | 0.7318 | 0.7380 | 0.8000   |
| No log        | 0.73  | 210  | 0.6654          | 0.7771    | 0.7598 | 0.7684 | 0.8273   |
| No log        | 0.77  | 220  | 0.6355          | 0.7701    | 0.7486 | 0.7592 | 0.8273   |
| No log        | 0.80  | 230  | 0.6380          | 0.7746    | 0.7486 | 0.7614 | 0.8182   |
| No log        | 0.84  | 240  | 0.6313          | 0.7784    | 0.7654 | 0.7718 | 0.8273   |
| No log        | 0.87  | 250  | 0.6356          | 0.7797    | 0.7709 | 0.7753 | 0.8318   |
| No log        | 0.91  | 260  | 0.6125          | 0.7853    | 0.7765 | 0.7809 | 0.8273   |
| No log        | 0.94  | 270  | 0.6109          | 0.7943    | 0.7765 | 0.7853 | 0.8364   |
| No log        | 0.98  | 280  | 0.6087          | 0.7771    | 0.7598 | 0.7684 | 0.8273   |
| No log        | 1.01  | 290  | 0.5677          | 0.8103    | 0.7877 | 0.7989 | 0.8455   |
| No log        | 1.05  | 300  | 0.5542          | 0.8057    | 0.7877 | 0.7966 | 0.8409   |
| No log        | 1.08  | 310  | 0.5490          | 0.8125    | 0.7989 | 0.8056 | 0.8500   |
| No log        | 1.11  | 320  | 0.5490          | 0.8023    | 0.7933 | 0.7978 | 0.8409   |
| No log        | 1.15  | 330  | 0.5632          | 0.7684    | 0.7598 | 0.7640 | 0.8091   |
| No log        | 1.18  | 340  | 0.5686          | 0.7809    | 0.7765 | 0.7787 | 0.8182   |
| No log        | 1.22  | 350  | 0.5302          | 0.7989    | 0.7989 | 0.7989 | 0.8364   |
| No log        | 1.25  | 360  | 0.5101          | 0.8146    | 0.8101 | 0.8123 | 0.8545   |
| No log        | 1.29  | 370  | 0.5180          | 0.8023    | 0.7933 | 0.7978 | 0.8500   |
| No log        | 1.32  | 380  | 0.5088          | 0.7966    | 0.7877 | 0.7921 | 0.8409   |
| No log        | 1.36  | 390  | 0.4975          | 0.8136    | 0.8045 | 0.8090 | 0.8455   |
| No log        | 1.39  | 400  | 0.4914          | 0.8202    | 0.8156 | 0.8179 | 0.8500   |
| No log        | 1.43  | 410  | 0.4870          | 0.8286    | 0.8101 | 0.8192 | 0.8545   |
| No log        | 1.46  | 420  | 0.4766          | 0.8171    | 0.7989 | 0.8079 | 0.8455   |
| No log        | 1.50  | 430  | 0.4723          | 0.8171    | 0.7989 | 0.8079 | 0.8455   |
| No log        | 1.53  | 440  | 0.4675          | 0.8171    | 0.7989 | 0.8079 | 0.8455   |
| No log        | 1.57  | 450  | 0.4579          | 0.8305    | 0.8212 | 0.8258 | 0.8636   |
| No log        | 1.60  | 460  | 0.4539          | 0.8362    | 0.8268 | 0.8315 | 0.8682   |
| No log        | 1.64  | 470  | 0.4505          | 0.8362    | 0.8268 | 0.8315 | 0.8682   |
| No log        | 1.67  | 480  | 0.4493          | 0.8514    | 0.8324 | 0.8418 | 0.8773   |
| No log        | 1.71  | 490  | 0.4489          | 0.8514    | 0.8324 | 0.8418 | 0.8773   |
| 0.82          | 1.74  | 500  | 0.4487          | 0.8409    | 0.8268 | 0.8338 | 0.8727   |
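
The precision, recall, F1, and accuracy columns are the usual token-classification metrics. The original metric code is not documented; a hedged sketch of the standard seqeval recipe could look like this (the label map is a placeholder):

```python
# Hedged sketch of the standard seqeval-based compute_metrics for
# token classification; the label map below is a placeholder.
import numpy as np
from seqeval.metrics import accuracy_score, f1_score, precision_score, recall_score

id2label = {0: "O", 1: "B-ANSWER", 2: "I-ANSWER"}  # placeholder label map

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Keep only real word positions; special/padded tokens are labeled -100.
    true_preds = [
        [id2label[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [id2label[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    return {
        "precision": precision_score(true_labels, true_preds),
        "recall": recall_score(true_labels, true_preds),
        "f1": f1_score(true_labels, true_preds),
        "accuracy": accuracy_score(true_labels, true_preds),
    }
```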


### Framework versions

- Transformers 4.30.2
- PyTorch 2.0.1+cu118
- Datasets 2.13.0
- Tokenizers 0.13.3