---
license: cc-by-nc-sa-4.0
base_model: microsoft/layoutlmv3-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: 2024-01-10_one_stage_subgraphs_weighted_txt_vis_conc_1_4_8_12_ramp
  results: []
---

# 2024-01-10_one_stage_subgraphs_weighted_txt_vis_conc_1_4_8_12_ramp

This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on an unknown dataset.
It achieves the following results on the evaluation set, where the `Exit n` metrics give the accuracy of each intermediate classification head (a hedged sketch of how such exits are typically used at inference follows the list):
- Loss: 1.2492
- Accuracy: 0.7825
- Exit 0 Accuracy: 0.2825
- Exit 1 Accuracy: 0.48
- Exit 2 Accuracy: 0.6675
- Exit 3 Accuracy: 0.7675
- Exit 4 Accuracy: 0.7825
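
Early-exit models usually trade accuracy for compute by returning the first intermediate head whose prediction is sufficiently confident, which is why the per-exit accuracies above matter. The exit criterion actually used for this checkpoint is not documented; the following is only a minimal, hypothetical confidence-threshold sketch:

```python
import torch

# Hypothetical confidence-based early exit: scan the exit heads in order and
# return the first prediction whose softmax confidence clears the threshold.
# The real exit rule for this checkpoint is undocumented; this is only a sketch.
def early_exit_predict(exit_logits: list[torch.Tensor], threshold: float = 0.9):
    for depth, logits in enumerate(exit_logits):
        probs = torch.softmax(logits, dim=-1)
        confidence, label = probs.max(dim=-1)
        if confidence.item() >= threshold:
            return depth, label.item()
    # No head was confident enough: fall back to the deepest (most accurate) head.
    return len(exit_logits) - 1, exit_logits[-1].argmax(dim=-1).item()
```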

## Model description

More information is needed from the author. Judging from the run name and the `Exit n` metrics, the checkpoint appears to add early-exit classification heads at intermediate transformer layers (plausibly layers 1, 4, 8, and 12), concatenate text and visual features, and train everything in a single stage with a ramp-weighted loss; one plausible reading of that loss is sketched below.
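
The following sketch shows one plausible reading of the "weighted ... ramp" part of the run name: each exit head contributes a cross-entropy term whose weight grows (ramps) with the depth of its layer. The exit depths and the linear ramp are assumptions inferred from the name only, not the author's confirmed training code:

```python
import torch
import torch.nn.functional as F

# Hypothetical ramp-weighted multi-exit training loss. The exit depths
# (1, 4, 8, 12) and the linear ramp are inferred from the run name alone.
EXIT_LAYERS = (1, 4, 8, 12)

def ramp_weighted_loss(exit_logits, final_logits, labels):
    # Weight each exit's cross-entropy by its layer index, normalized to sum
    # to 1, then add the final classifier's loss at full weight.
    weights = torch.tensor(EXIT_LAYERS, dtype=torch.float)
    weights = weights / weights.sum()
    exit_losses = torch.stack(
        [F.cross_entropy(logits, labels) for logits in exit_logits]
    )
    return (weights * exit_losses).sum() + F.cross_entropy(final_logits, labels)
```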

## Intended uses & limitations

More information needed
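
In the absence of author-provided details, here is a minimal, hypothetical loading sketch using stock Transformers classes. The repo path is a placeholder, and stock classes will only load the backbone and final head; the intermediate exit heads presumably require the author's custom modeling code:

```python
from transformers import AutoProcessor, LayoutLMv3ForSequenceClassification

# Placeholder hub id; replace with the actual repo path or a local directory.
repo = "2024-01-10_one_stage_subgraphs_weighted_txt_vis_conc_1_4_8_12_ramp"

processor = AutoProcessor.from_pretrained("microsoft/layoutlmv3-base")
model = LayoutLMv3ForSequenceClassification.from_pretrained(repo)
```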

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (restated as a `TrainingArguments` sketch after the list):
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 24
- total_train_batch_size: 48
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 60
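
For convenience, the same settings expressed as Transformers `TrainingArguments`; `output_dir` is a placeholder, and the Adam betas/epsilon listed above are the library defaults, so they need no explicit arguments:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="2024-01-10_one_stage_subgraphs_weighted_txt_vis_conc_1_4_8_12_ramp",
    learning_rate=2e-5,
    per_device_train_batch_size=2,   # effective batch: 2 x 24 accumulation = 48
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=24,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=60,
)
```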

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Exit 0 Accuracy | Exit 1 Accuracy | Exit 2 Accuracy | Exit 3 Accuracy | Exit 4 Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------------:|:---------------:|:---------------:|:---------------:|:---------------:|
| No log        | 0.96  | 16   | 2.6704          | 0.1675   | 0.065           | 0.0875          | 0.0625          | 0.0625          | 0.105           |
| No log        | 1.98  | 33   | 2.4843          | 0.2425   | 0.1075          | 0.1425          | 0.0625          | 0.0625          | 0.17            |
| No log        | 3.0   | 50   | 2.3121          | 0.295    | 0.1325          | 0.1375          | 0.0625          | 0.0625          | 0.2575          |
| No log        | 3.96  | 66   | 2.0499          | 0.4025   | 0.145           | 0.11            | 0.0625          | 0.0625          | 0.31            |
| No log        | 4.98  | 83   | 1.7740          | 0.5425   | 0.14            | 0.1225          | 0.0625          | 0.0625          | 0.4175          |
| No log        | 6.0   | 100  | 1.4803          | 0.6225   | 0.1525          | 0.1125          | 0.0625          | 0.0625          | 0.5075          |
| No log        | 6.96  | 116  | 1.3264          | 0.6625   | 0.1625          | 0.1375          | 0.0675          | 0.0625          | 0.5775          |
| No log        | 7.98  | 133  | 1.1949          | 0.705    | 0.1725          | 0.13            | 0.0775          | 0.0625          | 0.6225          |
| No log        | 9.0   | 150  | 1.0490          | 0.73     | 0.1675          | 0.15            | 0.1175          | 0.0625          | 0.6875          |
| No log        | 9.96  | 166  | 0.9819          | 0.7375   | 0.185           | 0.1825          | 0.1225          | 0.0625          | 0.685           |
| No log        | 10.98 | 183  | 0.9539          | 0.74     | 0.1825          | 0.18            | 0.155           | 0.0625          | 0.695           |
| No log        | 12.0  | 200  | 0.8850          | 0.7675   | 0.195           | 0.2275          | 0.1925          | 0.07            | 0.7475          |
| No log        | 12.96 | 216  | 0.8869          | 0.75     | 0.1925          | 0.225           | 0.3             | 0.1125          | 0.75            |
| No log        | 13.98 | 233  | 0.9250          | 0.7475   | 0.2025          | 0.255           | 0.325           | 0.12            | 0.75            |
| No log        | 15.0  | 250  | 0.8685          | 0.7875   | 0.215           | 0.18            | 0.315           | 0.14            | 0.78            |
| No log        | 15.96 | 266  | 0.8504          | 0.7875   | 0.2375          | 0.2225          | 0.405           | 0.405           | 0.79            |
| No log        | 16.98 | 283  | 0.9215          | 0.7725   | 0.235           | 0.1975          | 0.355           | 0.5075          | 0.775           |
| No log        | 18.0  | 300  | 0.9816          | 0.7575   | 0.2575          | 0.2325          | 0.405           | 0.5825          | 0.7575          |
| No log        | 18.96 | 316  | 0.9900          | 0.755    | 0.255           | 0.2075          | 0.4525          | 0.5925          | 0.7675          |
| No log        | 19.98 | 333  | 0.9651          | 0.78     | 0.255           | 0.245           | 0.485           | 0.595           | 0.7825          |
| No log        | 21.0  | 350  | 1.0235          | 0.7525   | 0.24            | 0.3             | 0.465           | 0.6825          | 0.7625          |
| No log        | 21.96 | 366  | 1.0137          | 0.7875   | 0.24            | 0.345           | 0.4775          | 0.7275          | 0.785           |
| No log        | 22.98 | 383  | 1.0876          | 0.765    | 0.235           | 0.345           | 0.4675          | 0.74            | 0.765           |
| No log        | 24.0  | 400  | 1.0696          | 0.77     | 0.2525          | 0.3025          | 0.52            | 0.755           | 0.775           |
| No log        | 24.96 | 416  | 1.0440          | 0.775    | 0.2525          | 0.285           | 0.49            | 0.77            | 0.7725          |
| No log        | 25.98 | 433  | 1.0962          | 0.76     | 0.255           | 0.2825          | 0.4975          | 0.775           | 0.7625          |
| No log        | 27.0  | 450  | 1.1214          | 0.7725   | 0.275           | 0.3525          | 0.515           | 0.7775          | 0.7725          |
| No log        | 27.96 | 466  | 1.1593          | 0.7775   | 0.27            | 0.325           | 0.52            | 0.7625          | 0.7775          |
| No log        | 28.98 | 483  | 1.1341          | 0.7625   | 0.2725          | 0.3925          | 0.545           | 0.7625          | 0.7625          |
| 0.5624        | 30.0  | 500  | 1.1682          | 0.7675   | 0.2775          | 0.415           | 0.5875          | 0.76            | 0.77            |
| 0.5624        | 30.96 | 516  | 1.1978          | 0.7725   | 0.26            | 0.415           | 0.585           | 0.7675          | 0.77            |
| 0.5624        | 31.98 | 533  | 1.1051          | 0.78     | 0.26            | 0.4275          | 0.59            | 0.7775          | 0.78            |
| 0.5624        | 33.0  | 550  | 1.0934          | 0.78     | 0.25            | 0.41            | 0.5825          | 0.775           | 0.7825          |
| 0.5624        | 33.96 | 566  | 1.1564          | 0.7825   | 0.26            | 0.395           | 0.5925          | 0.7775          | 0.78            |
| 0.5624        | 34.98 | 583  | 1.1605          | 0.7825   | 0.2775          | 0.4425          | 0.615           | 0.77            | 0.785           |
| 0.5624        | 36.0  | 600  | 1.1793          | 0.775    | 0.2825          | 0.4325          | 0.6             | 0.7775          | 0.775           |
| 0.5624        | 36.96 | 616  | 1.1635          | 0.785    | 0.29            | 0.4375          | 0.61            | 0.7625          | 0.7825          |
| 0.5624        | 37.98 | 633  | 1.1591          | 0.775    | 0.28            | 0.4375          | 0.615           | 0.765           | 0.7775          |
| 0.5624        | 39.0  | 650  | 1.1568          | 0.7775   | 0.2875          | 0.455           | 0.625           | 0.765           | 0.7775          |
| 0.5624        | 39.96 | 666  | 1.1686          | 0.78     | 0.2825          | 0.4375          | 0.6275          | 0.7675          | 0.78            |
| 0.5624        | 40.98 | 683  | 1.1720          | 0.785    | 0.275           | 0.45            | 0.6275          | 0.77            | 0.785           |
| 0.5624        | 42.0  | 700  | 1.1977          | 0.785    | 0.2775          | 0.4425          | 0.6375          | 0.76            | 0.785           |
| 0.5624        | 42.96 | 716  | 1.2252          | 0.7825   | 0.275           | 0.4575          | 0.6325          | 0.755           | 0.785           |
| 0.5624        | 43.98 | 733  | 1.2122          | 0.79     | 0.28            | 0.4625          | 0.64            | 0.76            | 0.79            |
| 0.5624        | 45.0  | 750  | 1.2193          | 0.78     | 0.2875          | 0.4625          | 0.6525          | 0.7675          | 0.775           |
| 0.5624        | 45.96 | 766  | 1.2197          | 0.7825   | 0.285           | 0.46            | 0.66            | 0.755           | 0.7775          |
| 0.5624        | 46.98 | 783  | 1.1791          | 0.785    | 0.2825          | 0.47            | 0.6475          | 0.7625          | 0.785           |
| 0.5624        | 48.0  | 800  | 1.1879          | 0.79     | 0.2825          | 0.47            | 0.655           | 0.7625          | 0.7925          |
| 0.5624        | 48.96 | 816  | 1.1847          | 0.795    | 0.285           | 0.4725          | 0.6525          | 0.7625          | 0.7975          |
| 0.5624        | 49.98 | 833  | 1.1964          | 0.7925   | 0.2825          | 0.48            | 0.665           | 0.765           | 0.785           |
| 0.5624        | 51.0  | 850  | 1.2254          | 0.7825   | 0.285           | 0.47            | 0.665           | 0.7675          | 0.7825          |
| 0.5624        | 51.96 | 866  | 1.2455          | 0.7875   | 0.285           | 0.4775          | 0.665           | 0.7625          | 0.785           |
| 0.5624        | 52.98 | 883  | 1.2492          | 0.7875   | 0.2825          | 0.48            | 0.6675          | 0.7625          | 0.7875          |
| 0.5624        | 54.0  | 900  | 1.2459          | 0.785    | 0.285           | 0.47            | 0.67            | 0.77            | 0.785           |
| 0.5624        | 54.96 | 916  | 1.2453          | 0.7825   | 0.2825          | 0.475           | 0.665           | 0.7675          | 0.7825          |
| 0.5624        | 55.98 | 933  | 1.2505          | 0.785    | 0.28            | 0.4775          | 0.665           | 0.7625          | 0.785           |
| 0.5624        | 57.0  | 950  | 1.2494          | 0.7825   | 0.2825          | 0.48            | 0.6675          | 0.765           | 0.7825          |
| 0.5624        | 57.6  | 960  | 1.2492          | 0.7825   | 0.2825          | 0.48            | 0.6675          | 0.7675          | 0.7825          |


### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3