---
license: cc-by-nc-sa-4.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: 2024-01-03_one_stage_subgraphs_weighted_txt_vis_conc_1_4_8_12_ramp
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# 2024-01-03_one_stage_subgraphs_weighted_txt_vis_conc_1_4_8_12_ramp

This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8574
- Accuracy: 0.77
- Exit 0 Accuracy: 0.2
- Exit 1 Accuracy: 0.3
- Exit 2 Accuracy: 0.1125
- Exit 3 Accuracy: 0.2725
- Exit 4 Accuracy: 0.2675
- Exit 5 Accuracy: 0.51
- Exit 6 Accuracy: 0.55
- Exit 7 Accuracy: 0.63
- Exit 8 Accuracy: 0.525
- Exit 9 Accuracy: 0.3425
- Exit 10 Accuracy: 0.445
- Exit 11 Accuracy: 0.6875
- Exit 12 Accuracy: 0.7575

## Model description

More information needed

## Intended uses & limitations

More information needed
+
43
+ ## Training and evaluation data
44
+
45
+ More information needed
46
+
47
+ ## Training procedure
48
+
49
+ ### Training hyperparameters
50
+
51
+ The following hyperparameters were used during training:
52
+ - learning_rate: 2e-05
53
+ - train_batch_size: 8
54
+ - eval_batch_size: 4
55
+ - seed: 42
56
+ - gradient_accumulation_steps: 24
57
+ - total_train_batch_size: 192
58
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
59
+ - lr_scheduler_type: linear
60
+ - num_epochs: 60
61
+
62
+ ### Training results
63
+
64
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy | Exit 0 Accuracy | Exit 1 Accuracy | Exit 2 Accuracy | Exit 3 Accuracy | Exit 4 Accuracy | Exit 5 Accuracy | Exit 6 Accuracy | Exit 7 Accuracy | Exit 8 Accuracy | Exit 9 Accuracy | Exit 10 Accuracy | Exit 11 Accuracy | Exit 12 Accuracy |
65
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:---------------:|:---------------:|:---------------:|:---------------:|:---------------:|:---------------:|:---------------:|:---------------:|:---------------:|:---------------:|:----------------:|:----------------:|:----------------:|
66
+ | No log | 0.96 | 4 | 2.7477 | 0.1025 | 0.07 | 0.0625 | 0.0625 | 0.0625 | 0.06 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.06 |
67
+ | No log | 1.96 | 8 | 2.7020 | 0.1275 | 0.0775 | 0.0625 | 0.0625 | 0.0625 | 0.0575 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.06 |
68
+ | No log | 2.96 | 12 | 2.6514 | 0.19 | 0.08 | 0.0625 | 0.0625 | 0.0625 | 0.065 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.065 |
69
+ | No log | 3.96 | 16 | 2.5720 | 0.215 | 0.095 | 0.065 | 0.0625 | 0.0625 | 0.06 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0925 |
70
+ | No log | 4.96 | 20 | 2.5033 | 0.2375 | 0.1 | 0.0675 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.1475 |
71
+ | No log | 5.96 | 24 | 2.4003 | 0.275 | 0.115 | 0.08 | 0.0625 | 0.0625 | 0.0625 | 0.0775 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.1525 |
72
+ | No log | 6.96 | 28 | 2.3192 | 0.3 | 0.12 | 0.0875 | 0.0625 | 0.0625 | 0.0625 | 0.09 | 0.075 | 0.065 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.1625 |
73
+ | No log | 7.96 | 32 | 2.2199 | 0.3325 | 0.1325 | 0.0875 | 0.0625 | 0.0625 | 0.0625 | 0.1025 | 0.075 | 0.065 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.185 |
74
+ | No log | 8.96 | 36 | 2.1335 | 0.36 | 0.1425 | 0.1025 | 0.0625 | 0.0625 | 0.0625 | 0.1375 | 0.09 | 0.07 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.235 |
75
+ | No log | 9.96 | 40 | 2.0290 | 0.4 | 0.1475 | 0.1025 | 0.0625 | 0.0625 | 0.06 | 0.15 | 0.0925 | 0.0725 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.27 |
76
+ | No log | 10.96 | 44 | 1.9285 | 0.4525 | 0.155 | 0.1025 | 0.0625 | 0.065 | 0.0675 | 0.1725 | 0.1225 | 0.08 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.305 |
77
+ | No log | 11.96 | 48 | 1.8071 | 0.495 | 0.155 | 0.1075 | 0.0625 | 0.0775 | 0.0825 | 0.2025 | 0.135 | 0.0925 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.355 |
78
+ | No log | 12.96 | 52 | 1.7194 | 0.5375 | 0.16 | 0.1075 | 0.0625 | 0.0775 | 0.0875 | 0.22 | 0.1625 | 0.1025 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.37 |
79
+ | No log | 13.96 | 56 | 1.5732 | 0.6 | 0.1625 | 0.11 | 0.0625 | 0.08 | 0.0775 | 0.2425 | 0.2125 | 0.14 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.43 |
80
+ | No log | 14.96 | 60 | 1.5141 | 0.6125 | 0.16 | 0.115 | 0.0625 | 0.0825 | 0.09 | 0.29 | 0.245 | 0.18 | 0.0625 | 0.065 | 0.0625 | 0.0625 | 0.4325 |
81
+ | No log | 15.96 | 64 | 1.4049 | 0.65 | 0.1575 | 0.1175 | 0.0625 | 0.0925 | 0.1 | 0.3275 | 0.2725 | 0.24 | 0.0675 | 0.065 | 0.0625 | 0.0625 | 0.51 |
82
+ | No log | 16.96 | 68 | 1.3476 | 0.665 | 0.16 | 0.115 | 0.0625 | 0.095 | 0.1025 | 0.345 | 0.285 | 0.2625 | 0.0675 | 0.075 | 0.0625 | 0.0625 | 0.5275 |
83
+ | No log | 17.96 | 72 | 1.2825 | 0.6925 | 0.16 | 0.1175 | 0.0625 | 0.1025 | 0.11 | 0.35 | 0.2875 | 0.2925 | 0.0675 | 0.075 | 0.065 | 0.0625 | 0.55 |
84
+ | No log | 18.96 | 76 | 1.2102 | 0.71 | 0.16 | 0.1175 | 0.0625 | 0.1025 | 0.135 | 0.365 | 0.29 | 0.31 | 0.0675 | 0.0775 | 0.08 | 0.0625 | 0.5925 |
85
+ | No log | 19.96 | 80 | 1.1664 | 0.725 | 0.16 | 0.1175 | 0.0625 | 0.1125 | 0.1325 | 0.3675 | 0.3075 | 0.365 | 0.0775 | 0.0775 | 0.08 | 0.0625 | 0.6025 |
86
+ | No log | 20.96 | 84 | 1.1363 | 0.735 | 0.16 | 0.12 | 0.0625 | 0.115 | 0.145 | 0.3725 | 0.3275 | 0.3775 | 0.0775 | 0.075 | 0.07 | 0.0625 | 0.6175 |
87
+ | No log | 21.96 | 88 | 1.0745 | 0.74 | 0.16 | 0.1225 | 0.0625 | 0.1175 | 0.1325 | 0.375 | 0.355 | 0.4175 | 0.0775 | 0.0825 | 0.065 | 0.0625 | 0.6275 |
88
+ | No log | 22.96 | 92 | 1.0377 | 0.7525 | 0.16 | 0.13 | 0.0625 | 0.115 | 0.1575 | 0.38 | 0.3775 | 0.4125 | 0.0825 | 0.075 | 0.07 | 0.0625 | 0.64 |
89
+ | No log | 23.96 | 96 | 1.0321 | 0.74 | 0.16 | 0.1375 | 0.0625 | 0.115 | 0.165 | 0.3825 | 0.385 | 0.4375 | 0.08 | 0.0975 | 0.07 | 0.0625 | 0.66 |
90
+ | No log | 24.96 | 100 | 0.9702 | 0.76 | 0.1625 | 0.15 | 0.0625 | 0.11 | 0.1725 | 0.385 | 0.4075 | 0.455 | 0.0825 | 0.0975 | 0.0725 | 0.0625 | 0.68 |
91
+ | No log | 25.96 | 104 | 0.9861 | 0.7525 | 0.1675 | 0.1525 | 0.0625 | 0.1125 | 0.1825 | 0.395 | 0.4 | 0.4675 | 0.0825 | 0.115 | 0.0675 | 0.07 | 0.6825 |
92
+ | No log | 26.96 | 108 | 0.9339 | 0.7525 | 0.16 | 0.15 | 0.0625 | 0.1175 | 0.195 | 0.4075 | 0.4275 | 0.4825 | 0.0875 | 0.13 | 0.07 | 0.095 | 0.7025 |
93
+ | No log | 27.96 | 112 | 0.9362 | 0.7575 | 0.1625 | 0.1425 | 0.0625 | 0.1175 | 0.1925 | 0.4175 | 0.43 | 0.515 | 0.095 | 0.14 | 0.0675 | 0.105 | 0.71 |
94
+ | No log | 28.96 | 116 | 0.8872 | 0.755 | 0.165 | 0.15 | 0.0625 | 0.1175 | 0.2 | 0.4275 | 0.4325 | 0.53 | 0.095 | 0.1425 | 0.07 | 0.125 | 0.7425 |
95
+ | No log | 29.96 | 120 | 0.8939 | 0.7675 | 0.1625 | 0.15 | 0.0625 | 0.1175 | 0.2 | 0.4475 | 0.4325 | 0.55 | 0.1 | 0.1425 | 0.085 | 0.1325 | 0.7175 |
96
+ | No log | 30.96 | 124 | 0.8767 | 0.7475 | 0.16 | 0.1575 | 0.0625 | 0.12 | 0.195 | 0.4425 | 0.4475 | 0.545 | 0.1 | 0.1525 | 0.0925 | 0.2025 | 0.7575 |
97
+ | No log | 31.96 | 128 | 0.8658 | 0.76 | 0.165 | 0.17 | 0.0625 | 0.1225 | 0.195 | 0.455 | 0.455 | 0.555 | 0.1025 | 0.1375 | 0.11 | 0.245 | 0.7375 |
98
+ | No log | 32.96 | 132 | 0.8736 | 0.7625 | 0.165 | 0.1875 | 0.0625 | 0.125 | 0.195 | 0.465 | 0.45 | 0.5625 | 0.105 | 0.155 | 0.0975 | 0.275 | 0.7575 |
99
+ | No log | 33.96 | 136 | 0.8380 | 0.7625 | 0.1675 | 0.21 | 0.0625 | 0.125 | 0.195 | 0.465 | 0.4625 | 0.565 | 0.1175 | 0.13 | 0.115 | 0.3225 | 0.755 |
100
+ | No log | 34.96 | 140 | 0.8386 | 0.7725 | 0.1675 | 0.2325 | 0.0625 | 0.1275 | 0.1975 | 0.4675 | 0.4575 | 0.575 | 0.12 | 0.125 | 0.13 | 0.345 | 0.765 |
101
+ | No log | 35.96 | 144 | 0.8610 | 0.755 | 0.17 | 0.2425 | 0.0625 | 0.1275 | 0.19 | 0.4625 | 0.4725 | 0.5875 | 0.13 | 0.105 | 0.18 | 0.39 | 0.755 |
102
+ | No log | 36.96 | 148 | 0.8444 | 0.76 | 0.17 | 0.255 | 0.0625 | 0.1325 | 0.2 | 0.4575 | 0.4675 | 0.595 | 0.12 | 0.1475 | 0.22 | 0.445 | 0.7525 |
103
+ | No log | 37.96 | 152 | 0.8845 | 0.75 | 0.1725 | 0.2725 | 0.0625 | 0.1375 | 0.205 | 0.46 | 0.475 | 0.59 | 0.1425 | 0.175 | 0.255 | 0.45 | 0.7575 |
104
+ | No log | 38.96 | 156 | 0.8464 | 0.7625 | 0.18 | 0.275 | 0.0625 | 0.145 | 0.2075 | 0.4675 | 0.4825 | 0.5925 | 0.18 | 0.22 | 0.265 | 0.51 | 0.755 |
105
+ | No log | 39.96 | 160 | 0.8539 | 0.7575 | 0.1825 | 0.2825 | 0.065 | 0.1475 | 0.215 | 0.48 | 0.515 | 0.6025 | 0.2 | 0.2425 | 0.2725 | 0.5275 | 0.755 |
106
+ | No log | 40.96 | 164 | 0.8697 | 0.76 | 0.185 | 0.2775 | 0.0675 | 0.1525 | 0.23 | 0.485 | 0.5325 | 0.605 | 0.2175 | 0.21 | 0.29 | 0.53 | 0.7625 |
107
+ | No log | 41.96 | 168 | 0.8395 | 0.775 | 0.185 | 0.2825 | 0.075 | 0.16 | 0.2225 | 0.4925 | 0.54 | 0.6 | 0.225 | 0.225 | 0.2875 | 0.5725 | 0.77 |
108
+ | No log | 42.96 | 172 | 0.8570 | 0.7675 | 0.1875 | 0.285 | 0.08 | 0.1575 | 0.2275 | 0.485 | 0.5475 | 0.61 | 0.2325 | 0.2325 | 0.3075 | 0.6075 | 0.7525 |
109
+ | No log | 43.96 | 176 | 0.8462 | 0.765 | 0.195 | 0.28 | 0.08 | 0.165 | 0.2325 | 0.49 | 0.5425 | 0.6125 | 0.2475 | 0.2425 | 0.3125 | 0.6225 | 0.755 |
110
+ | No log | 44.96 | 180 | 0.8563 | 0.765 | 0.195 | 0.2825 | 0.085 | 0.1775 | 0.235 | 0.495 | 0.535 | 0.6075 | 0.2975 | 0.22 | 0.3175 | 0.62 | 0.75 |
111
+ | No log | 45.96 | 184 | 0.8670 | 0.7675 | 0.195 | 0.28 | 0.085 | 0.1825 | 0.24 | 0.4975 | 0.54 | 0.615 | 0.3525 | 0.215 | 0.325 | 0.6375 | 0.76 |
112
+ | No log | 46.96 | 188 | 0.8708 | 0.77 | 0.195 | 0.29 | 0.0925 | 0.185 | 0.2375 | 0.4975 | 0.535 | 0.6125 | 0.365 | 0.2275 | 0.3175 | 0.64 | 0.7575 |
113
+ | No log | 47.96 | 192 | 0.8535 | 0.7675 | 0.19 | 0.29 | 0.095 | 0.2075 | 0.24 | 0.4975 | 0.5375 | 0.6125 | 0.4025 | 0.24 | 0.35 | 0.6575 | 0.755 |
114
+ | No log | 48.96 | 196 | 0.8592 | 0.765 | 0.19 | 0.285 | 0.0975 | 0.2175 | 0.2425 | 0.495 | 0.54 | 0.615 | 0.4175 | 0.2375 | 0.365 | 0.6575 | 0.7475 |
115
+ | No log | 49.96 | 200 | 0.8717 | 0.765 | 0.19 | 0.2925 | 0.1 | 0.235 | 0.25 | 0.5 | 0.545 | 0.6125 | 0.4325 | 0.25 | 0.3725 | 0.66 | 0.76 |
116
+ | No log | 50.96 | 204 | 0.8684 | 0.765 | 0.1925 | 0.2975 | 0.105 | 0.245 | 0.2575 | 0.5025 | 0.545 | 0.61 | 0.4475 | 0.2775 | 0.3775 | 0.675 | 0.7625 |
117
+ | No log | 51.96 | 208 | 0.8662 | 0.76 | 0.1925 | 0.295 | 0.1025 | 0.245 | 0.2625 | 0.5025 | 0.55 | 0.6175 | 0.455 | 0.2925 | 0.39 | 0.68 | 0.76 |
118
+ | No log | 52.96 | 212 | 0.8718 | 0.7625 | 0.1925 | 0.295 | 0.1075 | 0.2525 | 0.2625 | 0.5025 | 0.55 | 0.6225 | 0.485 | 0.3075 | 0.4125 | 0.6825 | 0.755 |
119
+ | No log | 53.96 | 216 | 0.8798 | 0.76 | 0.195 | 0.295 | 0.11 | 0.265 | 0.2625 | 0.505 | 0.5475 | 0.6275 | 0.495 | 0.3175 | 0.4275 | 0.68 | 0.7475 |
120
+ | No log | 54.96 | 220 | 0.8703 | 0.7575 | 0.2 | 0.2975 | 0.11 | 0.2675 | 0.2675 | 0.5075 | 0.545 | 0.6225 | 0.4975 | 0.3275 | 0.435 | 0.6825 | 0.745 |
121
+ | No log | 55.96 | 224 | 0.8622 | 0.765 | 0.2 | 0.3 | 0.11 | 0.265 | 0.27 | 0.51 | 0.545 | 0.625 | 0.505 | 0.33 | 0.435 | 0.69 | 0.7525 |
122
+ | No log | 56.96 | 228 | 0.8590 | 0.77 | 0.2 | 0.3 | 0.11 | 0.27 | 0.2675 | 0.5125 | 0.5475 | 0.6325 | 0.5075 | 0.34 | 0.4375 | 0.6875 | 0.76 |
123
+ | No log | 57.96 | 232 | 0.8572 | 0.7725 | 0.2 | 0.3025 | 0.11 | 0.27 | 0.2675 | 0.51 | 0.5475 | 0.6325 | 0.5175 | 0.34 | 0.44 | 0.6875 | 0.7575 |
124
+ | No log | 58.96 | 236 | 0.8570 | 0.7725 | 0.2 | 0.3025 | 0.1125 | 0.2725 | 0.2675 | 0.51 | 0.55 | 0.6325 | 0.5225 | 0.34 | 0.445 | 0.6875 | 0.76 |
125
+ | No log | 59.96 | 240 | 0.8574 | 0.77 | 0.2 | 0.3 | 0.1125 | 0.2725 | 0.2675 | 0.51 | 0.55 | 0.63 | 0.525 | 0.3425 | 0.445 | 0.6875 | 0.7575 |
126
+
127
+
128

### Framework versions

- Transformers 4.26.1
- PyTorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
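
Since exact versions can matter when reloading the checkpoint, here is a small optional sanity check against the versions listed above (an assumed convenience script, not part of the generated card):

```python
# Warn when the local environment differs from the versions the card reports.
import importlib

expected = {
    "transformers": "4.26.1",
    "datasets": "2.9.0",
    "tokenizers": "0.13.2",
    "torch": "1.13.1",  # the card lists the build 1.13.1.post200
}
for pkg, want in expected.items():
    have = importlib.import_module(pkg).__version__
    if not have.startswith(want):
        print(f"warning: {pkg} {have} differs from {want} used in training")
```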