jordyvl committed 5e6a15f (1 parent: 1ecf6b3)

update model card README.md

Files changed (1): README.md added (+125, -0)

---
license: cc-by-nc-sa-4.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: EElayoutlmv3_jordyvl_rvl_cdip_100_examples_per_class_2023-09-30_ent_75_gates_exitloss
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# EElayoutlmv3_jordyvl_rvl_cdip_100_examples_per_class_2023-09-30_ent_75_gates_exitloss

This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on an unknown dataset (the model name suggests a 100-examples-per-class subset of RVL-CDIP).
It achieves the following results on the evaluation set; a loading sketch follows the list:
- Loss: 1.0087
- Accuracy: 0.7475
- Exit 0 Accuracy: 0.065
- Exit 1 Accuracy: 0.1175
- Exit 2 Accuracy: 0.085
- Exit 3 Accuracy: 0.0925
- Exit 4 Accuracy: 0.0625

+ ## Model description
28
+
29
+ More information needed
30
+
31
+ ## Intended uses & limitations
32
+
33
+ More information needed
34
+
35
+ ## Training and evaluation data
36
+
37
+ More information needed
38
+
39
+ ## Training procedure
40
+
41
+ ### Training hyperparameters
42
+
43
+ The following hyperparameters were used during training:
44
+ - learning_rate: 2e-05
45
+ - train_batch_size: 16
46
+ - eval_batch_size: 4
47
+ - seed: 42
48
+ - gradient_accumulation_steps: 12
49
+ - total_train_batch_size: 192
50
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
51
+ - lr_scheduler_type: linear
52
+ - num_epochs: 60
53
+
54
+ ### Training results
55
+
56
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy | Exit 0 Accuracy | Exit 1 Accuracy | Exit 2 Accuracy | Exit 3 Accuracy | Exit 4 Accuracy |
57
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:---------------:|:---------------:|:---------------:|:---------------:|:---------------:|
58
+ | No log | 0.96 | 4 | 2.7344 | 0.1275 | 0.0625 | 0.0925 | 0.0825 | 0.0625 | 0.0625 |
59
+ | No log | 1.96 | 8 | 2.6660 | 0.1725 | 0.0625 | 0.1025 | 0.085 | 0.0625 | 0.0625 |
60
+ | No log | 2.96 | 12 | 2.6216 | 0.1975 | 0.065 | 0.1075 | 0.085 | 0.0625 | 0.0625 |
61
+ | No log | 3.96 | 16 | 2.5681 | 0.2275 | 0.0675 | 0.1075 | 0.0775 | 0.0625 | 0.0625 |
62
+ | No log | 4.96 | 20 | 2.5015 | 0.2475 | 0.05 | 0.1125 | 0.0775 | 0.0625 | 0.0625 |
63
+ | No log | 5.96 | 24 | 2.4350 | 0.265 | 0.05 | 0.115 | 0.075 | 0.0625 | 0.0625 |
64
+ | No log | 6.96 | 28 | 2.3608 | 0.29 | 0.0525 | 0.115 | 0.075 | 0.0625 | 0.0625 |
65
+ | No log | 7.96 | 32 | 2.2703 | 0.3175 | 0.065 | 0.115 | 0.08 | 0.0625 | 0.0625 |
66
+ | No log | 8.96 | 36 | 2.1981 | 0.3375 | 0.07 | 0.11 | 0.07 | 0.0625 | 0.0625 |
67
+ | No log | 9.96 | 40 | 2.0994 | 0.4 | 0.0675 | 0.1075 | 0.0675 | 0.0625 | 0.0625 |
68
+ | No log | 10.96 | 44 | 1.9883 | 0.4525 | 0.0675 | 0.1075 | 0.0675 | 0.0625 | 0.0625 |
69
+ | No log | 11.96 | 48 | 1.9012 | 0.5 | 0.0675 | 0.105 | 0.0675 | 0.0625 | 0.0625 |
70
+ | No log | 12.96 | 52 | 1.7577 | 0.5675 | 0.0675 | 0.1 | 0.065 | 0.0625 | 0.0625 |
71
+ | No log | 13.96 | 56 | 1.6503 | 0.5775 | 0.0675 | 0.1125 | 0.065 | 0.0625 | 0.0625 |
72
+ | No log | 14.96 | 60 | 1.5502 | 0.605 | 0.0675 | 0.1125 | 0.065 | 0.0625 | 0.0625 |
73
+ | No log | 15.96 | 64 | 1.4495 | 0.6525 | 0.0675 | 0.1125 | 0.0775 | 0.0625 | 0.0625 |
74
+ | No log | 16.96 | 68 | 1.3666 | 0.6625 | 0.0675 | 0.1125 | 0.0725 | 0.0625 | 0.0625 |
75
+ | No log | 17.96 | 72 | 1.2893 | 0.6875 | 0.0675 | 0.1125 | 0.08 | 0.0625 | 0.0625 |
76
+ | No log | 18.96 | 76 | 1.2417 | 0.6925 | 0.07 | 0.1125 | 0.0875 | 0.0625 | 0.0625 |
77
+ | No log | 19.96 | 80 | 1.1923 | 0.695 | 0.07 | 0.1125 | 0.09 | 0.0625 | 0.0625 |
78
+ | No log | 20.96 | 84 | 1.1585 | 0.7025 | 0.065 | 0.1125 | 0.0925 | 0.0625 | 0.0625 |
79
+ | No log | 21.96 | 88 | 1.1008 | 0.72 | 0.0575 | 0.1125 | 0.09 | 0.0625 | 0.0625 |
80
+ | No log | 22.96 | 92 | 1.0846 | 0.7125 | 0.0525 | 0.1125 | 0.0925 | 0.0625 | 0.0625 |
81
+ | No log | 23.96 | 96 | 1.0559 | 0.7375 | 0.0525 | 0.1125 | 0.09 | 0.0625 | 0.0625 |
82
+ | No log | 24.96 | 100 | 1.0155 | 0.7375 | 0.0525 | 0.1125 | 0.0875 | 0.0625 | 0.0625 |
83
+ | No log | 25.96 | 104 | 1.0025 | 0.7475 | 0.0525 | 0.1125 | 0.085 | 0.0625 | 0.0625 |
84
+ | No log | 26.96 | 108 | 0.9911 | 0.74 | 0.0575 | 0.1125 | 0.0875 | 0.0625 | 0.0625 |
85
+ | No log | 27.96 | 112 | 0.9721 | 0.755 | 0.0575 | 0.115 | 0.0875 | 0.0625 | 0.0625 |
86
+ | No log | 28.96 | 116 | 0.9727 | 0.7475 | 0.0575 | 0.115 | 0.0775 | 0.0625 | 0.0625 |
87
+ | No log | 29.96 | 120 | 0.9829 | 0.7325 | 0.0575 | 0.1175 | 0.0725 | 0.0625 | 0.0625 |
88
+ | No log | 30.96 | 124 | 0.9521 | 0.75 | 0.055 | 0.1175 | 0.0625 | 0.0625 | 0.0625 |
89
+ | No log | 31.96 | 128 | 0.9669 | 0.735 | 0.05 | 0.1175 | 0.06 | 0.0625 | 0.0625 |
90
+ | No log | 32.96 | 132 | 0.9490 | 0.7575 | 0.0525 | 0.12 | 0.065 | 0.065 | 0.0625 |
91
+ | No log | 33.96 | 136 | 0.9476 | 0.745 | 0.0525 | 0.12 | 0.07 | 0.065 | 0.0625 |
92
+ | No log | 34.96 | 140 | 0.9559 | 0.755 | 0.0575 | 0.12 | 0.0625 | 0.065 | 0.0625 |
93
+ | No log | 35.96 | 144 | 0.9475 | 0.75 | 0.0525 | 0.12 | 0.065 | 0.0675 | 0.0625 |
94
+ | No log | 36.96 | 148 | 0.9643 | 0.74 | 0.0525 | 0.12 | 0.0625 | 0.07 | 0.0625 |
95
+ | No log | 37.96 | 152 | 0.9589 | 0.7425 | 0.0525 | 0.12 | 0.0575 | 0.075 | 0.0625 |
96
+ | No log | 38.96 | 156 | 0.9499 | 0.7525 | 0.0525 | 0.12 | 0.055 | 0.075 | 0.0625 |
97
+ | No log | 39.96 | 160 | 0.9775 | 0.74 | 0.06 | 0.12 | 0.0575 | 0.0775 | 0.0625 |
98
+ | No log | 40.96 | 164 | 0.9579 | 0.7375 | 0.0575 | 0.12 | 0.0575 | 0.0775 | 0.0625 |
99
+ | No log | 41.96 | 168 | 0.9795 | 0.74 | 0.0575 | 0.12 | 0.06 | 0.0825 | 0.0625 |
100
+ | No log | 42.96 | 172 | 0.9802 | 0.75 | 0.055 | 0.12 | 0.06 | 0.08 | 0.0625 |
101
+ | No log | 43.96 | 176 | 0.9660 | 0.7525 | 0.06 | 0.1175 | 0.0625 | 0.075 | 0.0625 |
102
+ | No log | 44.96 | 180 | 0.9677 | 0.76 | 0.07 | 0.1175 | 0.07 | 0.0775 | 0.0625 |
103
+ | No log | 45.96 | 184 | 0.9818 | 0.745 | 0.07 | 0.1175 | 0.075 | 0.085 | 0.0625 |
104
+ | No log | 46.96 | 188 | 0.9764 | 0.755 | 0.0675 | 0.1175 | 0.0825 | 0.09 | 0.0625 |
105
+ | No log | 47.96 | 192 | 0.9793 | 0.755 | 0.065 | 0.1175 | 0.085 | 0.0875 | 0.0625 |
106
+ | No log | 48.96 | 196 | 0.9823 | 0.75 | 0.065 | 0.1175 | 0.0825 | 0.09 | 0.0625 |
107
+ | No log | 49.96 | 200 | 0.9897 | 0.7475 | 0.0625 | 0.1175 | 0.085 | 0.0875 | 0.0625 |
108
+ | No log | 50.96 | 204 | 1.0066 | 0.755 | 0.065 | 0.1175 | 0.085 | 0.09 | 0.0625 |
109
+ | No log | 51.96 | 208 | 1.0096 | 0.7425 | 0.065 | 0.1175 | 0.085 | 0.09 | 0.0625 |
110
+ | No log | 52.96 | 212 | 1.0037 | 0.75 | 0.065 | 0.1175 | 0.085 | 0.0925 | 0.0625 |
111
+ | No log | 53.96 | 216 | 1.0060 | 0.75 | 0.065 | 0.1175 | 0.085 | 0.095 | 0.0625 |
112
+ | No log | 54.96 | 220 | 1.0032 | 0.75 | 0.0625 | 0.1175 | 0.0825 | 0.095 | 0.0625 |
113
+ | No log | 55.96 | 224 | 1.0001 | 0.755 | 0.065 | 0.1175 | 0.0825 | 0.0925 | 0.0625 |
114
+ | No log | 56.96 | 228 | 1.0023 | 0.7525 | 0.065 | 0.1175 | 0.085 | 0.0925 | 0.0625 |
115
+ | No log | 57.96 | 232 | 1.0058 | 0.7475 | 0.0625 | 0.1175 | 0.085 | 0.0925 | 0.0625 |
116
+ | No log | 58.96 | 236 | 1.0079 | 0.7475 | 0.065 | 0.1175 | 0.085 | 0.0925 | 0.0625 |
117
+ | No log | 59.96 | 240 | 1.0087 | 0.7475 | 0.065 | 0.1175 | 0.085 | 0.0925 | 0.0625 |
118
+
119
+
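Note that the exit accuracies in the table hover around 0.0625, the chance level for the 16-class RVL-CDIP task, so early exiting would rarely be trustworthy for this checkpoint. Assuming `ent_75` in the model name denotes an entropy-threshold gate (a guess from the name, not something this card documents), the inference-time gating conventionally looks like this sketch:

```python
import torch

def entropy(logits: torch.Tensor) -> torch.Tensor:
    """Shannon entropy of the softmax distribution (per example)."""
    p = torch.softmax(logits, dim=-1)
    return -(p * p.clamp_min(1e-12).log()).sum(dim=-1)

def early_exit_predict(exit_logits: list, threshold: float = 0.75):
    """Take the first exit that is confident enough (entropy below
    `threshold`); otherwise fall back to the final classifier.
    Batch size 1 assumed; `exit_logits` is one logits tensor per exit."""
    for logits in exit_logits[:-1]:            # intermediate exits, in depth order
        if entropy(logits).item() < threshold:
            return logits.argmax(-1)
    return exit_logits[-1].argmax(-1)          # final classification head
```
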
### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2