gaodrew committed
Commit c2f049d
1 Parent(s): f9fe0f1

Model save

Files changed (2)
  1. README.md +181 -9
  2. model.safetensors +1 -1
README.md CHANGED
@@ -13,7 +13,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [](https://huggingface.co/) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 6.9015
+ - Loss: 5.4614
 
 ## Model description
 
@@ -33,21 +33,193 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 0.0005
- - train_batch_size: 256
- - eval_batch_size: 256
+ - train_batch_size: 512
+ - eval_batch_size: 512
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: cosine
- - lr_scheduler_warmup_steps: 2000
- - training_steps: 1000
+ - lr_scheduler_warmup_steps: 3000
+ - num_epochs: 1
 - mixed_precision_training: Native AMP
 
 ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss |
- |:-------------:|:------:|:----:|:---------------:|
- | 7.2863 | 0.5214 | 500 | 7.1323 |
- | 6.8168 | 1.0428 | 1000 | 6.9015 |
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:------:|:-----:|:---------------:|
+ | 7.172 | 0.0057 | 500 | 7.1975 |
+ | 6.7223 | 0.0115 | 1000 | 6.9021 |
+ | 6.4626 | 0.0172 | 1500 | 6.6753 |
+ | 6.2743 | 0.0230 | 2000 | 6.5041 |
+ | 6.1669 | 0.0287 | 2500 | 6.3793 |
+ | 6.1212 | 0.0344 | 3000 | 6.3379 |
+ | 6.0511 | 0.0402 | 3500 | 6.3103 |
+ | 5.9919 | 0.0459 | 4000 | 6.2384 |
+ | 5.9364 | 0.0516 | 4500 | 6.2122 |
+ | 5.887 | 0.0574 | 5000 | 6.1205 |
+ | 5.8564 | 0.0631 | 5500 | 6.1243 |
+ | 5.8264 | 0.0689 | 6000 | 6.0895 |
+ | 5.7942 | 0.0746 | 6500 | 6.0713 |
+ | 5.7826 | 0.0803 | 7000 | 6.0359 |
+ | 5.7475 | 0.0861 | 7500 | 6.0351 |
+ | 5.7329 | 0.0918 | 8000 | 6.0039 |
+ | 5.7127 | 0.0975 | 8500 | 5.9915 |
+ | 5.6971 | 0.1033 | 9000 | 5.9835 |
+ | 5.6825 | 0.1090 | 9500 | 5.9810 |
+ | 5.6695 | 0.1148 | 10000 | 5.9542 |
+ | 5.6537 | 0.1205 | 10500 | 5.9367 |
+ | 5.6455 | 0.1262 | 11000 | 5.9178 |
+ | 5.6254 | 0.1320 | 11500 | 5.9097 |
+ | 5.62 | 0.1377 | 12000 | 5.9092 |
+ | 5.608 | 0.1434 | 12500 | 5.8881 |
+ | 5.5993 | 0.1492 | 13000 | 5.8807 |
+ | 5.5891 | 0.1549 | 13500 | 5.8707 |
+ | 5.5791 | 0.1607 | 14000 | 5.8809 |
+ | 5.5701 | 0.1664 | 14500 | 5.8585 |
+ | 5.5651 | 0.1721 | 15000 | 5.8436 |
+ | 5.5595 | 0.1779 | 15500 | 5.8607 |
+ | 5.5509 | 0.1836 | 16000 | 5.8308 |
+ | 5.5401 | 0.1894 | 16500 | 5.8381 |
+ | 5.535 | 0.1951 | 17000 | 5.8749 |
+ | 5.5281 | 0.2008 | 17500 | 5.8331 |
+ | 5.5231 | 0.2066 | 18000 | 5.8139 |
+ | 5.5148 | 0.2123 | 18500 | 5.8078 |
+ | 5.5112 | 0.2180 | 19000 | 5.8016 |
+ | 5.5049 | 0.2238 | 19500 | 5.8034 |
+ | 5.5006 | 0.2295 | 20000 | 5.8025 |
+ | 5.4909 | 0.2353 | 20500 | 5.8017 |
+ | 5.4835 | 0.2410 | 21000 | 5.7782 |
+ | 5.4841 | 0.2467 | 21500 | 5.7862 |
+ | 5.4794 | 0.2525 | 22000 | 5.7690 |
+ | 5.476 | 0.2582 | 22500 | 5.7689 |
+ | 5.4668 | 0.2639 | 23000 | 5.7806 |
+ | 5.4585 | 0.2697 | 23500 | 5.7678 |
+ | 5.4573 | 0.2754 | 24000 | 5.7499 |
+ | 5.4551 | 0.2812 | 24500 | 5.7696 |
+ | 5.451 | 0.2869 | 25000 | 5.7564 |
+ | 5.4465 | 0.2926 | 25500 | 5.7508 |
+ | 5.4396 | 0.2984 | 26000 | 5.7414 |
+ | 5.4356 | 0.3041 | 26500 | 5.7354 |
+ | 5.4321 | 0.3098 | 27000 | 5.7471 |
+ | 5.427 | 0.3156 | 27500 | 5.7296 |
+ | 5.4242 | 0.3213 | 28000 | 5.7294 |
+ | 5.4192 | 0.3271 | 28500 | 5.7252 |
+ | 5.4168 | 0.3328 | 29000 | 5.7183 |
+ | 5.4135 | 0.3385 | 29500 | 5.7241 |
+ | 5.4077 | 0.3443 | 30000 | 5.7148 |
+ | 5.4051 | 0.3500 | 30500 | 5.7215 |
+ | 5.3994 | 0.3557 | 31000 | 5.7140 |
+ | 5.3992 | 0.3615 | 31500 | 5.7079 |
+ | 5.3902 | 0.3672 | 32000 | 5.7057 |
+ | 5.3848 | 0.3730 | 32500 | 5.7047 |
+ | 5.3865 | 0.3787 | 33000 | 5.6973 |
+ | 5.3824 | 0.3844 | 33500 | 5.6938 |
+ | 5.3769 | 0.3902 | 34000 | 5.6950 |
+ | 5.3733 | 0.3959 | 34500 | 5.6885 |
+ | 5.3694 | 0.4017 | 35000 | 5.6819 |
+ | 5.3638 | 0.4074 | 35500 | 5.6770 |
+ | 5.3611 | 0.4131 | 36000 | 5.6819 |
+ | 5.3615 | 0.4189 | 36500 | 5.6705 |
+ | 5.354 | 0.4246 | 37000 | 5.6757 |
+ | 5.3522 | 0.4303 | 37500 | 5.6718 |
+ | 5.3422 | 0.4361 | 38000 | 5.6679 |
+ | 5.343 | 0.4418 | 38500 | 5.6655 |
+ | 5.3434 | 0.4476 | 39000 | 5.6591 |
+ | 5.3385 | 0.4533 | 39500 | 5.6608 |
+ | 5.3325 | 0.4590 | 40000 | 5.6629 |
+ | 5.3315 | 0.4648 | 40500 | 5.6581 |
+ | 5.3317 | 0.4705 | 41000 | 5.6534 |
+ | 5.3275 | 0.4762 | 41500 | 5.6447 |
+ | 5.3202 | 0.4820 | 42000 | 5.6451 |
+ | 5.3149 | 0.4877 | 42500 | 5.6348 |
+ | 5.313 | 0.4935 | 43000 | 5.6366 |
+ | 5.3122 | 0.4992 | 43500 | 5.6384 |
+ | 5.3065 | 0.5049 | 44000 | 5.6326 |
+ | 5.299 | 0.5107 | 44500 | 5.6226 |
+ | 5.2997 | 0.5164 | 45000 | 5.6301 |
+ | 5.2959 | 0.5221 | 45500 | 5.6172 |
+ | 5.2907 | 0.5279 | 46000 | 5.6232 |
+ | 5.2889 | 0.5336 | 46500 | 5.6239 |
+ | 5.288 | 0.5394 | 47000 | 5.6069 |
+ | 5.277 | 0.5451 | 47500 | 5.6154 |
+ | 5.2765 | 0.5508 | 48000 | 5.6157 |
+ | 5.2739 | 0.5566 | 48500 | 5.6035 |
+ | 5.2693 | 0.5623 | 49000 | 5.6009 |
+ | 5.2635 | 0.5681 | 49500 | 5.5978 |
+ | 5.2682 | 0.5738 | 50000 | 5.5987 |
+ | 5.258 | 0.5795 | 50500 | 5.5971 |
+ | 5.26 | 0.5853 | 51000 | 5.5994 |
+ | 5.2568 | 0.5910 | 51500 | 5.5873 |
+ | 5.2446 | 0.5967 | 52000 | 5.5771 |
+ | 5.2469 | 0.6025 | 52500 | 5.5824 |
+ | 5.2459 | 0.6082 | 53000 | 5.5853 |
+ | 5.239 | 0.6140 | 53500 | 5.5781 |
+ | 5.2355 | 0.6197 | 54000 | 5.5729 |
+ | 5.2296 | 0.6254 | 54500 | 5.5737 |
+ | 5.2301 | 0.6312 | 55000 | 5.5656 |
+ | 5.2277 | 0.6369 | 55500 | 5.5716 |
+ | 5.2197 | 0.6426 | 56000 | 5.5583 |
+ | 5.2131 | 0.6484 | 56500 | 5.5639 |
+ | 5.2132 | 0.6541 | 57000 | 5.5537 |
+ | 5.2103 | 0.6599 | 57500 | 5.5656 |
+ | 5.2078 | 0.6656 | 58000 | 5.5524 |
+ | 5.203 | 0.6713 | 58500 | 5.5470 |
+ | 5.2035 | 0.6771 | 59000 | 5.5454 |
+ | 5.1963 | 0.6828 | 59500 | 5.5428 |
+ | 5.1932 | 0.6885 | 60000 | 5.5355 |
+ | 5.1906 | 0.6943 | 60500 | 5.5338 |
+ | 5.1864 | 0.7000 | 61000 | 5.5352 |
+ | 5.1823 | 0.7058 | 61500 | 5.5295 |
+ | 5.179 | 0.7115 | 62000 | 5.5296 |
+ | 5.1752 | 0.7172 | 62500 | 5.5259 |
+ | 5.1726 | 0.7230 | 63000 | 5.5291 |
+ | 5.1692 | 0.7287 | 63500 | 5.5183 |
+ | 5.1694 | 0.7345 | 64000 | 5.5173 |
+ | 5.1624 | 0.7402 | 64500 | 5.5162 |
+ | 5.161 | 0.7459 | 65000 | 5.5104 |
+ | 5.1588 | 0.7517 | 65500 | 5.5145 |
+ | 5.156 | 0.7574 | 66000 | 5.5057 |
+ | 5.1539 | 0.7631 | 66500 | 5.5036 |
+ | 5.1474 | 0.7689 | 67000 | 5.5093 |
+ | 5.1457 | 0.7746 | 67500 | 5.5052 |
+ | 5.1444 | 0.7804 | 68000 | 5.4979 |
+ | 5.1437 | 0.7861 | 68500 | 5.4979 |
+ | 5.1402 | 0.7918 | 69000 | 5.5009 |
+ | 5.1346 | 0.7976 | 69500 | 5.4907 |
+ | 5.1308 | 0.8033 | 70000 | 5.4905 |
+ | 5.1325 | 0.8090 | 70500 | 5.4880 |
+ | 5.1297 | 0.8148 | 71000 | 5.4866 |
+ | 5.1246 | 0.8205 | 71500 | 5.4871 |
+ | 5.1232 | 0.8263 | 72000 | 5.4846 |
+ | 5.1232 | 0.8320 | 72500 | 5.4840 |
+ | 5.1219 | 0.8377 | 73000 | 5.4811 |
+ | 5.1126 | 0.8435 | 73500 | 5.4791 |
+ | 5.1168 | 0.8492 | 74000 | 5.4782 |
+ | 5.1175 | 0.8549 | 74500 | 5.4763 |
+ | 5.1108 | 0.8607 | 75000 | 5.4771 |
+ | 5.1082 | 0.8664 | 75500 | 5.4742 |
+ | 5.1053 | 0.8722 | 76000 | 5.4738 |
+ | 5.107 | 0.8779 | 76500 | 5.4718 |
+ | 5.1074 | 0.8836 | 77000 | 5.4694 |
+ | 5.1074 | 0.8894 | 77500 | 5.4692 |
+ | 5.1038 | 0.8951 | 78000 | 5.4697 |
+ | 5.105 | 0.9008 | 78500 | 5.4684 |
+ | 5.1019 | 0.9066 | 79000 | 5.4665 |
+ | 5.1015 | 0.9123 | 79500 | 5.4663 |
+ | 5.101 | 0.9181 | 80000 | 5.4672 |
+ | 5.1022 | 0.9238 | 80500 | 5.4654 |
+ | 5.1 | 0.9295 | 81000 | 5.4632 |
+ | 5.0981 | 0.9353 | 81500 | 5.4637 |
+ | 5.0981 | 0.9410 | 82000 | 5.4630 |
+ | 5.0951 | 0.9468 | 82500 | 5.4619 |
+ | 5.0941 | 0.9525 | 83000 | 5.4628 |
+ | 5.0947 | 0.9582 | 83500 | 5.4621 |
+ | 5.0949 | 0.9640 | 84000 | 5.4625 |
+ | 5.0971 | 0.9697 | 84500 | 5.4618 |
+ | 5.0895 | 0.9754 | 85000 | 5.4619 |
+ | 5.0937 | 0.9812 | 85500 | 5.4616 |
+ | 5.0978 | 0.9869 | 86000 | 5.4615 |
+ | 5.0958 | 0.9927 | 86500 | 5.4614 |
+ | 5.095 | 0.9984 | 87000 | 5.4614 |
 
 
 ### Framework versions
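For context, the updated hyperparameters above correspond to a standard 🤗 Transformers `Trainer` run. Below is a minimal sketch of the equivalent `TrainingArguments`; the `output_dir` is a placeholder, the model and datasets are not named by this card, and older transformers releases spell `eval_strategy` as `evaluation_strategy`.

```python
# Sketch of TrainingArguments matching the hyperparameters in this commit.
# output_dir, model, and datasets are hypothetical placeholders.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="out",                 # placeholder
    learning_rate=5e-4,               # learning_rate: 0.0005
    per_device_train_batch_size=512,  # train_batch_size: 512
    per_device_eval_batch_size=512,   # eval_batch_size: 512
    seed=42,                          # seed: 42
    adam_beta1=0.9,                   # optimizer: Adam with betas=(0.9,0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                # epsilon=1e-08
    lr_scheduler_type="cosine",       # lr_scheduler_type: cosine
    warmup_steps=3000,                # lr_scheduler_warmup_steps: 3000
    num_train_epochs=1,               # num_epochs: 1
    fp16=True,                        # mixed_precision_training: Native AMP
    eval_strategy="steps",            # the table logs an eval every 500 steps
    eval_steps=500,
    logging_steps=500,
)

# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```

Assuming no gradient accumulation, the table implies roughly 87,000 / 0.9984 ≈ 87,139 optimizer steps per epoch, i.e. on the order of 45 million training examples at batch size 512.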
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:370dc4c73fbafd1129ea418b2b0769fefad4ac318bd5dcfb80867f90dd1dc386
+ oid sha256:11f34800f460eccf4a3974cc6894aefabf6674ccdf2763eae554d47ea6054db3
 size 373739136
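The weights file is stored via Git LFS, so the repository only tracks a pointer whose `oid` is the SHA-256 of the actual blob. A downloaded `model.safetensors` can be checked against this commit by rehashing it locally; a minimal sketch (the file path is a placeholder):

```python
# Verify a downloaded model.safetensors against the LFS pointer above
# by recomputing its SHA-256. The path is a placeholder.
import hashlib

EXPECTED_OID = "11f34800f460eccf4a3974cc6894aefabf6674ccdf2763eae554d47ea6054db3"

sha = hashlib.sha256()
with open("model.safetensors", "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        sha.update(chunk)

assert sha.hexdigest() == EXPECTED_OID, "checksum mismatch"
print("OK: file matches the committed LFS object")
```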