---
license: cc-by-nc-sa-4.0
base_model: microsoft/layoutlmv3-base
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: layoutlmv3-finetuned-language-levels-v5-4000
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# layoutlmv3-finetuned-language-levels-v5-4000

This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on an unspecified dataset.
It achieves the following results on the evaluation set (an example usage sketch follows the metrics):
- Loss: 0.0913
- Precision: 1.0
- Recall: 1.0
- F1: 1.0
- Accuracy: 0.9824
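
The card does not document usage, so below is a minimal inference sketch under stated assumptions: the checkpoint path, the example words and bounding boxes, and the `apply_ocr=False` setting are illustrative placeholders and may differ from how the model was actually trained and packaged.

```python
# Minimal inference sketch. Assumptions: the checkpoint path, input words/boxes,
# and apply_ocr=False are illustrative placeholders, not documented in this card.
import torch
from PIL import Image
from transformers import AutoProcessor, AutoModelForTokenClassification

model_id = "layoutlmv3-finetuned-language-levels-v5-4000"  # hypothetical path or Hub ID
processor = AutoProcessor.from_pretrained(model_id, apply_ocr=False)
model = AutoModelForTokenClassification.from_pretrained(model_id)

image = Image.open("page.png").convert("RGB")  # placeholder document image
words = ["Beginner", "level", "text"]          # placeholder OCR words
boxes = [[50, 50, 200, 80], [210, 50, 300, 80], [310, 50, 400, 80]]  # 0-1000 normalized

encoding = processor(image, words, boxes=boxes, return_tensors="pt")
with torch.no_grad():
    logits = model(**encoding).logits

predicted_ids = logits.argmax(-1).squeeze().tolist()
print([model.config.id2label[i] for i in predicted_ids])  # per-token label names
```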

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 4000
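
As a rough guide, these values map onto the Hugging Face `TrainingArguments` API as sketched below. The output directory and the 100-step evaluation cadence (inferred from the results table) are assumptions, and the dataset, processor, and `compute_metrics` wiring are not documented in this card.

```python
# Hedged reproduction sketch of the listed hyperparameters; the Adam betas and
# epsilon above match the optimizer defaults, so they are not set explicitly here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlmv3-finetuned-language-levels-v5-4000",  # assumed output dir
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    max_steps=4000,               # "training_steps: 4000"
    lr_scheduler_type="linear",
    eval_strategy="steps",        # evaluation every 100 steps, per the results table
    eval_steps=100,
    logging_steps=500,            # training loss appears every 500 steps in the table
)
```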

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 0.2618  | 100  | 0.8184          | 0.9647    | 0.9630 | 0.9639 | 0.6971   |
| No log        | 0.5236  | 200  | 0.6254          | 0.9743    | 0.9815 | 0.9779 | 0.7404   |
| No log        | 0.7853  | 300  | 0.4760          | 0.9926    | 0.9944 | 0.9935 | 0.7981   |
| No log        | 1.0471  | 400  | 0.3675          | 0.9798    | 0.9889 | 0.9843 | 0.8910   |
| 0.6763        | 1.3089  | 500  | 0.0913          | 1.0       | 1.0    | 1.0    | 0.9824   |
| 0.6763        | 1.5707  | 600  | 0.0375          | 1.0       | 1.0    | 1.0    | 0.9904   |
| 0.6763        | 1.8325  | 700  | 0.0149          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.6763        | 2.0942  | 800  | 0.0078          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.6763        | 2.3560  | 900  | 0.0047          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.037         | 2.6178  | 1000 | 0.0037          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.037         | 2.8796  | 1100 | 0.0030          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.037         | 3.1414  | 1200 | 0.0026          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.037         | 3.4031  | 1300 | 0.0022          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.037         | 3.6649  | 1400 | 0.0019          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0046        | 3.9267  | 1500 | 0.0017          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0046        | 4.1885  | 1600 | 0.0016          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0046        | 4.4503  | 1700 | 0.0014          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0046        | 4.7120  | 1800 | 0.0013          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0046        | 4.9738  | 1900 | 0.0012          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 5.2356  | 2000 | 0.0011          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 5.4974  | 2100 | 0.0010          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 5.7592  | 2200 | 0.0010          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 6.0209  | 2300 | 0.0009          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 6.2827  | 2400 | 0.0009          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0017        | 6.5445  | 2500 | 0.0008          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0017        | 6.8063  | 2600 | 0.0008          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0017        | 7.0681  | 2700 | 0.0007          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0017        | 7.3298  | 2800 | 0.0007          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0017        | 7.5916  | 2900 | 0.0007          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0013        | 7.8534  | 3000 | 0.0007          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0013        | 8.1152  | 3100 | 0.0006          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0013        | 8.3770  | 3200 | 0.0006          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0013        | 8.6387  | 3300 | 0.0006          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0013        | 8.9005  | 3400 | 0.0006          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0011        | 9.1623  | 3500 | 0.0006          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0011        | 9.4241  | 3600 | 0.0006          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0011        | 9.6859  | 3700 | 0.0006          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0011        | 9.9476  | 3800 | 0.0005          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0011        | 10.2094 | 3900 | 0.0005          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.001         | 10.4712 | 4000 | 0.0005          | 1.0       | 1.0    | 1.0    | 1.0      |


### Framework versions

- Transformers 4.43.3
- Pytorch 2.1.0+cu118
- Datasets 2.20.0
- Tokenizers 0.19.1