woody72 committed
Commit e95ad8c
Parent: 41f877a

Model save

README.md ADDED
@@ -0,0 +1,529 @@

---
license: other
base_model: deepseek-ai/deepseek-math-7b-base
tags:
- trl
- sft
- generated_from_trainer
model-index:
- name: albert-no-variable-items-length
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# albert-no-variable-items-length

This model is a fine-tuned version of [deepseek-ai/deepseek-math-7b-base](https://huggingface.co/deepseek-ai/deepseek-math-7b-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0201
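
For quick reference, the checkpoint can be loaded like any causal LM. This is a minimal sketch, assuming the hub path `woody72/albert-no-variable-items-length` (inferred from the committer and model name above; the actual repository id may differ):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "woody72/albert-no-variable-items-length"  # assumed hub path
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# The base model is a math LM, so a simple arithmetic prompt is a reasonable smoke test.
prompt = "Question: What is 12 * (3 + 4)?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```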

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 3e-06
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant_with_warmup
- lr_scheduler_warmup_steps: 1
- num_epochs: 3
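These settings map onto `transformers.TrainingArguments` roughly as below. This is a hedged sketch, not the actual training script: the `trl`/`sft` tags suggest an `SFTTrainer` run, but the dataset and trainer wiring are unknown, and `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

# Assumed reconstruction from the hyperparameter list above.
args = TrainingArguments(
    output_dir="albert-no-variable-items-length",  # placeholder
    learning_rate=3e-06,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,  # 16 * 4 = 64 = total_train_batch_size (single device assumed)
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="constant_with_warmup",
    warmup_steps=1,
    num_train_epochs=3,
)
```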

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.7887 | 0.01 | 1 | 0.3267 |
| 0.8195 | 0.01 | 2 | 0.3220 |
| 0.7704 | 0.02 | 3 | 0.3179 |
| 0.7745 | 0.03 | 4 | 0.3148 |
| 0.7651 | 0.03 | 5 | 0.3115 |
| 0.7484 | 0.04 | 6 | 0.3085 |
| 0.7621 | 0.04 | 7 | 0.3065 |
| 0.7409 | 0.05 | 8 | 0.3033 |
| 0.7138 | 0.06 | 9 | 0.3003 |
| 0.7204 | 0.06 | 10 | 0.2985 |
| 0.6912 | 0.07 | 11 | 0.2952 |
| 0.7163 | 0.08 | 12 | 0.2922 |
| 0.6911 | 0.08 | 13 | 0.2901 |
| 0.6774 | 0.09 | 14 | 0.2878 |
| 0.6692 | 0.1 | 15 | 0.2851 |
| 0.6055 | 0.1 | 16 | 0.2825 |
| 0.4833 | 0.11 | 17 | 0.2801 |
| 0.4824 | 0.12 | 18 | 0.2765 |
| 0.4753 | 0.12 | 19 | 0.2752 |
| 0.4653 | 0.13 | 20 | 0.2722 |
| 0.466 | 0.13 | 21 | 0.2696 |
| 0.4599 | 0.14 | 22 | 0.2685 |
| 0.4484 | 0.15 | 23 | 0.2647 |
| 0.4508 | 0.15 | 24 | 0.2615 |
| 0.4436 | 0.16 | 25 | 0.2594 |
| 0.4526 | 0.17 | 26 | 0.2565 |
| 0.4332 | 0.17 | 27 | 0.2551 |
| 0.4246 | 0.18 | 28 | 0.2515 |
| 0.4241 | 0.19 | 29 | 0.2480 |
| 0.4173 | 0.19 | 30 | 0.2453 |
| 0.4031 | 0.2 | 31 | 0.2435 |
| 0.4122 | 0.2 | 32 | 0.2400 |
| 0.408 | 0.21 | 33 | 0.2386 |
| 0.3971 | 0.22 | 34 | 0.2361 |
| 0.4002 | 0.22 | 35 | 0.2337 |
| 0.3881 | 0.23 | 36 | 0.2310 |
| 0.3965 | 0.24 | 37 | 0.2272 |
| 0.3731 | 0.24 | 38 | 0.2245 |
| 0.3743 | 0.25 | 39 | 0.2211 |
| 0.3625 | 0.26 | 40 | 0.2191 |
| 0.3619 | 0.26 | 41 | 0.2167 |
| 0.3557 | 0.27 | 42 | 0.2149 |
| 0.3539 | 0.28 | 43 | 0.2107 |
| 0.3367 | 0.28 | 44 | 0.2089 |
| 0.3427 | 0.29 | 45 | 0.2061 |
| 0.333 | 0.29 | 46 | 0.2026 |
| 0.308 | 0.3 | 47 | 0.1996 |
| 0.2572 | 0.31 | 48 | 0.1969 |
| 0.2568 | 0.31 | 49 | 0.1934 |
| 0.2452 | 0.32 | 50 | 0.1915 |
| 0.2406 | 0.33 | 51 | 0.1874 |
| 0.2395 | 0.33 | 52 | 0.1834 |
| 0.2337 | 0.34 | 53 | 0.1805 |
| 0.2223 | 0.35 | 54 | 0.1767 |
| 0.2295 | 0.35 | 55 | 0.1732 |
| 0.222 | 0.36 | 56 | 0.1692 |
| 0.2174 | 0.36 | 57 | 0.1653 |
| 0.2064 | 0.37 | 58 | 0.1637 |
| 0.2075 | 0.38 | 59 | 0.1591 |
| 0.2014 | 0.38 | 60 | 0.1559 |
| 0.1963 | 0.39 | 61 | 0.1519 |
| 0.1908 | 0.4 | 62 | 0.1485 |
| 0.1963 | 0.4 | 63 | 0.1445 |
| 0.1793 | 0.41 | 64 | 0.1408 |
| 0.179 | 0.42 | 65 | 0.1377 |
| 0.1645 | 0.42 | 66 | 0.1334 |
| 0.1651 | 0.43 | 67 | 0.1299 |
| 0.1632 | 0.44 | 68 | 0.1264 |
| 0.1549 | 0.44 | 69 | 0.1224 |
| 0.1479 | 0.45 | 70 | 0.1186 |
| 0.1491 | 0.45 | 71 | 0.1153 |
| 0.1397 | 0.46 | 72 | 0.1109 |
| 0.1384 | 0.47 | 73 | 0.1069 |
| 0.1308 | 0.47 | 74 | 0.1025 |
| 0.1144 | 0.48 | 75 | 0.0987 |
| 0.1191 | 0.49 | 76 | 0.0946 |
| 0.1132 | 0.49 | 77 | 0.0910 |
| 0.1057 | 0.5 | 78 | 0.0867 |
| 0.0911 | 0.51 | 79 | 0.0841 |
| 0.1005 | 0.51 | 80 | 0.0812 |
| 0.0885 | 0.52 | 81 | 0.0790 |
| 0.091 | 0.52 | 82 | 0.0766 |
| 0.0841 | 0.53 | 83 | 0.0742 |
| 0.0808 | 0.54 | 84 | 0.0728 |
| 0.0752 | 0.54 | 85 | 0.0708 |
| 0.0717 | 0.55 | 86 | 0.0690 |
| 0.0767 | 0.56 | 87 | 0.0676 |
| 0.069 | 0.56 | 88 | 0.0658 |
| 0.0721 | 0.57 | 89 | 0.0643 |
| 0.074 | 0.58 | 90 | 0.0635 |
| 0.0633 | 0.58 | 91 | 0.0621 |
| 0.0706 | 0.59 | 92 | 0.0609 |
| 0.0575 | 0.6 | 93 | 0.0587 |
| 0.0641 | 0.6 | 94 | 0.0584 |
| 0.0574 | 0.61 | 95 | 0.0572 |
| 0.0658 | 0.61 | 96 | 0.0567 |
| 0.0696 | 0.62 | 97 | 0.0553 |
| 0.0603 | 0.63 | 98 | 0.0544 |
| 0.0571 | 0.63 | 99 | 0.0537 |
| 0.057 | 0.64 | 100 | 0.0527 |
| 0.0569 | 0.65 | 101 | 0.0514 |
| 0.0521 | 0.65 | 102 | 0.0511 |
| 0.052 | 0.66 | 103 | 0.0501 |
| 0.0529 | 0.67 | 104 | 0.0490 |
| 0.0523 | 0.67 | 105 | 0.0490 |
| 0.0567 | 0.68 | 106 | 0.0480 |
| 0.0547 | 0.68 | 107 | 0.0475 |
| 0.0509 | 0.69 | 108 | 0.0468 |
| 0.0506 | 0.7 | 109 | 0.0462 |
| 0.0533 | 0.7 | 110 | 0.0458 |
| 0.0544 | 0.71 | 111 | 0.0449 |
| 0.0521 | 0.72 | 112 | 0.0442 |
| 0.0527 | 0.72 | 113 | 0.0439 |
| 0.0495 | 0.73 | 114 | 0.0436 |
| 0.0471 | 0.74 | 115 | 0.0430 |
| 0.0442 | 0.74 | 116 | 0.0424 |
| 0.0432 | 0.75 | 117 | 0.0421 |
| 0.0451 | 0.76 | 118 | 0.0420 |
| 0.0498 | 0.76 | 119 | 0.0409 |
| 0.0468 | 0.77 | 120 | 0.0409 |
| 0.0474 | 0.77 | 121 | 0.0403 |
| 0.0462 | 0.78 | 122 | 0.0399 |
| 0.037 | 0.79 | 123 | 0.0397 |
| 0.041 | 0.79 | 124 | 0.0393 |
| 0.045 | 0.8 | 125 | 0.0387 |
| 0.045 | 0.81 | 126 | 0.0393 |
| 0.0416 | 0.81 | 127 | 0.0381 |
| 0.0418 | 0.82 | 128 | 0.0383 |
| 0.0382 | 0.83 | 129 | 0.0377 |
| 0.0444 | 0.83 | 130 | 0.0376 |
| 0.0404 | 0.84 | 131 | 0.0373 |
| 0.0438 | 0.84 | 132 | 0.0369 |
| 0.0411 | 0.85 | 133 | 0.0365 |
| 0.041 | 0.86 | 134 | 0.0364 |
| 0.0425 | 0.86 | 135 | 0.0362 |
| 0.0418 | 0.87 | 136 | 0.0356 |
| 0.0407 | 0.88 | 137 | 0.0358 |
| 0.04 | 0.88 | 138 | 0.0362 |
| 0.0382 | 0.89 | 139 | 0.0358 |
| 0.0415 | 0.9 | 140 | 0.0351 |
| 0.0374 | 0.9 | 141 | 0.0353 |
| 0.0377 | 0.91 | 142 | 0.0350 |
| 0.0368 | 0.92 | 143 | 0.0348 |
| 0.0389 | 0.92 | 144 | 0.0348 |
| 0.035 | 0.93 | 145 | 0.0344 |
| 0.0377 | 0.93 | 146 | 0.0346 |
| 0.0394 | 0.94 | 147 | 0.0344 |
| 0.0349 | 0.95 | 148 | 0.0344 |
| 0.0379 | 0.95 | 149 | 0.0341 |
| 0.0336 | 0.96 | 150 | 0.0340 |
| 0.037 | 0.97 | 151 | 0.0338 |
| 0.0338 | 0.97 | 152 | 0.0337 |
| 0.0384 | 0.98 | 153 | 0.0337 |
| 0.0385 | 0.99 | 154 | 0.0333 |
| 0.0345 | 0.99 | 155 | 0.0333 |
| 0.0354 | 1.0 | 156 | 0.0330 |
| 0.0359 | 1.0 | 157 | 0.0324 |
| 0.0372 | 1.01 | 158 | 0.0328 |
| 0.0337 | 1.02 | 159 | 0.0321 |
| 0.0344 | 1.02 | 160 | 0.0322 |
| 0.0351 | 1.03 | 161 | 0.0319 |
| 0.0324 | 1.04 | 162 | 0.0324 |
| 0.034 | 1.04 | 163 | 0.0320 |
| 0.0287 | 1.05 | 164 | 0.0321 |
| 0.03 | 1.06 | 165 | 0.0320 |
| 0.0314 | 1.06 | 166 | 0.0319 |
| 0.0275 | 1.07 | 167 | 0.0315 |
| 0.0213 | 1.08 | 168 | 0.0321 |
| 0.0277 | 1.08 | 169 | 0.0321 |
| 0.0275 | 1.09 | 170 | 0.0319 |
| 0.0182 | 1.09 | 171 | 0.0320 |
| 0.0236 | 1.1 | 172 | 0.0319 |
| 0.0319 | 1.11 | 173 | 0.0312 |
| 0.0358 | 1.11 | 174 | 0.0319 |
| 0.0328 | 1.12 | 175 | 0.0319 |
| 0.0337 | 1.13 | 176 | 0.0318 |
| 0.0382 | 1.13 | 177 | 0.0319 |
| 0.0315 | 1.14 | 178 | 0.0312 |
| 0.0308 | 1.15 | 179 | 0.0320 |
| 0.0314 | 1.15 | 180 | 0.0325 |
| 0.0312 | 1.16 | 181 | 0.0324 |
| 0.0326 | 1.16 | 182 | 0.0327 |
| 0.0278 | 1.17 | 183 | 0.0332 |
| 0.0264 | 1.18 | 184 | 0.0329 |
| 0.0307 | 1.18 | 185 | 0.0334 |
| 0.0255 | 1.19 | 186 | 0.0336 |
| 0.0284 | 1.2 | 187 | 0.0338 |
| 0.029 | 1.2 | 188 | 0.0342 |
| 0.0299 | 1.21 | 189 | 0.0343 |
| 0.0317 | 1.22 | 190 | 0.0342 |
| 0.0319 | 1.22 | 191 | 0.0345 |
| 0.0281 | 1.23 | 192 | 0.0344 |
| 0.0293 | 1.24 | 193 | 0.0341 |
| 0.0256 | 1.24 | 194 | 0.0344 |
| 0.0253 | 1.25 | 195 | 0.0345 |
| 0.0255 | 1.25 | 196 | 0.0344 |
| 0.0238 | 1.26 | 197 | 0.0343 |
| 0.0247 | 1.27 | 198 | 0.0338 |
| 0.0239 | 1.27 | 199 | 0.0344 |
| 0.0248 | 1.28 | 200 | 0.0345 |
| 0.0253 | 1.29 | 201 | 0.0343 |
| 0.0252 | 1.29 | 202 | 0.0340 |
| 0.0275 | 1.3 | 203 | 0.0340 |
| 0.0376 | 1.31 | 204 | 0.0336 |
| 0.0374 | 1.31 | 205 | 0.0333 |
| 0.0397 | 1.32 | 206 | 0.0331 |
| 0.0361 | 1.32 | 207 | 0.0322 |
| 0.0402 | 1.33 | 208 | 0.0319 |
| 0.0307 | 1.34 | 209 | 0.0317 |
| 0.0305 | 1.34 | 210 | 0.0309 |
| 0.0285 | 1.35 | 211 | 0.0307 |
| 0.0301 | 1.36 | 212 | 0.0307 |
| 0.0298 | 1.36 | 213 | 0.0306 |
| 0.0278 | 1.37 | 214 | 0.0305 |
| 0.0283 | 1.38 | 215 | 0.0303 |
| 0.0311 | 1.38 | 216 | 0.0304 |
| 0.0314 | 1.39 | 217 | 0.0306 |
| 0.0301 | 1.4 | 218 | 0.0304 |
| 0.0301 | 1.4 | 219 | 0.0303 |
| 0.0297 | 1.41 | 220 | 0.0299 |
| 0.0298 | 1.41 | 221 | 0.0300 |
| 0.0316 | 1.42 | 222 | 0.0299 |
| 0.0258 | 1.43 | 223 | 0.0296 |
| 0.0297 | 1.43 | 224 | 0.0297 |
| 0.0307 | 1.44 | 225 | 0.0289 |
| 0.0256 | 1.45 | 226 | 0.0285 |
| 0.0291 | 1.45 | 227 | 0.0285 |
| 0.0295 | 1.46 | 228 | 0.0286 |
| 0.0263 | 1.47 | 229 | 0.0283 |
| 0.0301 | 1.47 | 230 | 0.0284 |
| 0.0289 | 1.48 | 231 | 0.0285 |
| 0.0272 | 1.48 | 232 | 0.0286 |
| 0.0297 | 1.49 | 233 | 0.0286 |
| 0.0261 | 1.5 | 234 | 0.0286 |
| 0.0254 | 1.5 | 235 | 0.0286 |
| 0.0298 | 1.51 | 236 | 0.0284 |
| 0.0329 | 1.52 | 237 | 0.0278 |
| 0.0325 | 1.52 | 238 | 0.0281 |
| 0.0297 | 1.53 | 239 | 0.0280 |
| 0.0274 | 1.54 | 240 | 0.0281 |
| 0.0291 | 1.54 | 241 | 0.0277 |
| 0.0271 | 1.55 | 242 | 0.0279 |
| 0.0283 | 1.56 | 243 | 0.0278 |
| 0.0258 | 1.56 | 244 | 0.0277 |
| 0.0271 | 1.57 | 245 | 0.0276 |
| 0.0279 | 1.57 | 246 | 0.0273 |
| 0.0282 | 1.58 | 247 | 0.0274 |
| 0.0286 | 1.59 | 248 | 0.0272 |
| 0.0248 | 1.59 | 249 | 0.0268 |
| 0.0268 | 1.6 | 250 | 0.0272 |
| 0.0239 | 1.61 | 251 | 0.0271 |
| 0.0321 | 1.61 | 252 | 0.0268 |
| 0.0305 | 1.62 | 253 | 0.0266 |
| 0.0307 | 1.63 | 254 | 0.0263 |
| 0.0245 | 1.63 | 255 | 0.0266 |
| 0.0261 | 1.64 | 256 | 0.0268 |
| 0.0264 | 1.64 | 257 | 0.0262 |
| 0.0268 | 1.65 | 258 | 0.0264 |
| 0.0253 | 1.66 | 259 | 0.0261 |
| 0.0267 | 1.66 | 260 | 0.0261 |
| 0.0276 | 1.67 | 261 | 0.0262 |
| 0.0269 | 1.68 | 262 | 0.0260 |
| 0.0265 | 1.68 | 263 | 0.0262 |
| 0.0267 | 1.69 | 264 | 0.0262 |
| 0.0256 | 1.7 | 265 | 0.0260 |
| 0.0285 | 1.7 | 266 | 0.0257 |
| 0.0305 | 1.71 | 267 | 0.0259 |
| 0.0302 | 1.72 | 268 | 0.0262 |
| 0.0294 | 1.72 | 269 | 0.0258 |
| 0.0295 | 1.73 | 270 | 0.0255 |
| 0.027 | 1.73 | 271 | 0.0255 |
| 0.0276 | 1.74 | 272 | 0.0256 |
| 0.0256 | 1.75 | 273 | 0.0257 |
| 0.0248 | 1.75 | 274 | 0.0257 |
| 0.0305 | 1.76 | 275 | 0.0253 |
| 0.029 | 1.77 | 276 | 0.0253 |
| 0.0296 | 1.77 | 277 | 0.0251 |
| 0.0307 | 1.78 | 278 | 0.0250 |
| 0.0247 | 1.79 | 279 | 0.0250 |
| 0.0235 | 1.79 | 280 | 0.0250 |
| 0.0267 | 1.8 | 281 | 0.0250 |
| 0.0278 | 1.8 | 282 | 0.0246 |
| 0.0277 | 1.81 | 283 | 0.0248 |
| 0.0266 | 1.82 | 284 | 0.0242 |
| 0.0232 | 1.82 | 285 | 0.0244 |
| 0.0288 | 1.83 | 286 | 0.0243 |
| 0.0276 | 1.84 | 287 | 0.0243 |
| 0.0295 | 1.84 | 288 | 0.0241 |
| 0.0253 | 1.85 | 289 | 0.0240 |
| 0.0267 | 1.86 | 290 | 0.0244 |
| 0.0286 | 1.86 | 291 | 0.0242 |
| 0.0272 | 1.87 | 292 | 0.0237 |
| 0.0267 | 1.88 | 293 | 0.0240 |
| 0.0251 | 1.88 | 294 | 0.0243 |
| 0.0257 | 1.89 | 295 | 0.0239 |
| 0.0283 | 1.89 | 296 | 0.0237 |
| 0.0246 | 1.9 | 297 | 0.0238 |
| 0.0241 | 1.91 | 298 | 0.0238 |
| 0.0259 | 1.91 | 299 | 0.0241 |
| 0.0246 | 1.92 | 300 | 0.0237 |
| 0.0253 | 1.93 | 301 | 0.0239 |
| 0.0257 | 1.93 | 302 | 0.0237 |
| 0.0232 | 1.94 | 303 | 0.0238 |
| 0.0241 | 1.95 | 304 | 0.0237 |
| 0.0254 | 1.95 | 305 | 0.0237 |
| 0.0227 | 1.96 | 306 | 0.0238 |
| 0.023 | 1.96 | 307 | 0.0238 |
| 0.0232 | 1.97 | 308 | 0.0238 |
| 0.0268 | 1.98 | 309 | 0.0234 |
| 0.027 | 1.98 | 310 | 0.0237 |
| 0.0243 | 1.99 | 311 | 0.0235 |
| 0.025 | 2.0 | 312 | 0.0235 |
| 0.0185 | 2.0 | 313 | 0.0235 |
| 0.0153 | 2.01 | 314 | 0.0234 |
| 0.0119 | 2.02 | 315 | 0.0236 |
| 0.0094 | 2.02 | 316 | 0.0235 |
| 0.0171 | 2.03 | 317 | 0.0237 |
| 0.0121 | 2.04 | 318 | 0.0235 |
| 0.0135 | 2.04 | 319 | 0.0231 |
| 0.0182 | 2.05 | 320 | 0.0235 |
| 0.0128 | 2.05 | 321 | 0.0232 |
| 0.014 | 2.06 | 322 | 0.0235 |
| 0.0142 | 2.07 | 323 | 0.0237 |
| 0.0084 | 2.07 | 324 | 0.0236 |
| 0.0137 | 2.08 | 325 | 0.0235 |
| 0.0144 | 2.09 | 326 | 0.0238 |
| 0.0114 | 2.09 | 327 | 0.0237 |
| 0.0104 | 2.1 | 328 | 0.0239 |
| 0.0205 | 2.11 | 329 | 0.0234 |
| 0.0234 | 2.11 | 330 | 0.0233 |
| 0.0227 | 2.12 | 331 | 0.0237 |
| 0.0217 | 2.12 | 332 | 0.0235 |
| 0.025 | 2.13 | 333 | 0.0237 |
| 0.0208 | 2.14 | 334 | 0.0245 |
| 0.0192 | 2.14 | 335 | 0.0245 |
| 0.0195 | 2.15 | 336 | 0.0249 |
| 0.0203 | 2.16 | 337 | 0.0253 |
| 0.0234 | 2.16 | 338 | 0.0252 |
| 0.0176 | 2.17 | 339 | 0.0259 |
| 0.018 | 2.18 | 340 | 0.0260 |
| 0.0188 | 2.18 | 341 | 0.0265 |
| 0.0198 | 2.19 | 342 | 0.0262 |
| 0.0172 | 2.2 | 343 | 0.0268 |
| 0.0184 | 2.2 | 344 | 0.0271 |
| 0.0192 | 2.21 | 345 | 0.0273 |
| 0.0203 | 2.21 | 346 | 0.0277 |
| 0.0221 | 2.22 | 347 | 0.0283 |
| 0.0189 | 2.23 | 348 | 0.0282 |
| 0.0196 | 2.23 | 349 | 0.0289 |
| 0.0205 | 2.24 | 350 | 0.0288 |
| 0.0168 | 2.25 | 351 | 0.0291 |
| 0.0176 | 2.25 | 352 | 0.0294 |
| 0.0165 | 2.26 | 353 | 0.0295 |
| 0.0148 | 2.27 | 354 | 0.0301 |
| 0.0178 | 2.27 | 355 | 0.0296 |
| 0.0163 | 2.28 | 356 | 0.0301 |
| 0.0203 | 2.28 | 357 | 0.0303 |
| 0.0163 | 2.29 | 358 | 0.0301 |
| 0.0183 | 2.3 | 359 | 0.0301 |
| 0.0266 | 2.3 | 360 | 0.0299 |
| 0.0295 | 2.31 | 361 | 0.0295 |
| 0.0293 | 2.32 | 362 | 0.0291 |
| 0.0248 | 2.32 | 363 | 0.0281 |
| 0.0285 | 2.33 | 364 | 0.0273 |
| 0.0252 | 2.34 | 365 | 0.0262 |
| 0.0222 | 2.34 | 366 | 0.0257 |
| 0.0208 | 2.35 | 367 | 0.0245 |
| 0.0194 | 2.36 | 368 | 0.0239 |
| 0.0217 | 2.36 | 369 | 0.0238 |
| 0.0183 | 2.37 | 370 | 0.0237 |
| 0.0218 | 2.37 | 371 | 0.0238 |
| 0.0224 | 2.38 | 372 | 0.0233 |
| 0.023 | 2.39 | 373 | 0.0235 |
| 0.0205 | 2.39 | 374 | 0.0235 |
| 0.0215 | 2.4 | 375 | 0.0237 |
| 0.0189 | 2.41 | 376 | 0.0238 |
| 0.0233 | 2.41 | 377 | 0.0236 |
| 0.0225 | 2.42 | 378 | 0.0238 |
| 0.0196 | 2.43 | 379 | 0.0233 |
| 0.0224 | 2.43 | 380 | 0.0232 |
| 0.0214 | 2.44 | 381 | 0.0234 |
| 0.0187 | 2.44 | 382 | 0.0233 |
| 0.0199 | 2.45 | 383 | 0.0231 |
| 0.0227 | 2.46 | 384 | 0.0231 |
| 0.0199 | 2.46 | 385 | 0.0231 |
| 0.0248 | 2.47 | 386 | 0.0231 |
| 0.0196 | 2.48 | 387 | 0.0231 |
| 0.0214 | 2.48 | 388 | 0.0231 |
| 0.022 | 2.49 | 389 | 0.0230 |
| 0.0201 | 2.5 | 390 | 0.0232 |
| 0.0205 | 2.5 | 391 | 0.0233 |
| 0.0221 | 2.51 | 392 | 0.0231 |
| 0.0255 | 2.52 | 393 | 0.0233 |
| 0.0235 | 2.52 | 394 | 0.0232 |
| 0.0235 | 2.53 | 395 | 0.0231 |
| 0.0237 | 2.53 | 396 | 0.0232 |
| 0.0216 | 2.54 | 397 | 0.0236 |
| 0.0229 | 2.55 | 398 | 0.0232 |
| 0.0191 | 2.55 | 399 | 0.0231 |
| 0.0211 | 2.56 | 400 | 0.0231 |
| 0.02 | 2.57 | 401 | 0.0232 |
| 0.0217 | 2.57 | 402 | 0.0228 |
| 0.0228 | 2.58 | 403 | 0.0228 |
| 0.0222 | 2.59 | 404 | 0.0230 |
| 0.0197 | 2.59 | 405 | 0.0226 |
| 0.0195 | 2.6 | 406 | 0.0225 |
| 0.0208 | 2.6 | 407 | 0.0227 |
| 0.0238 | 2.61 | 408 | 0.0226 |
| 0.0247 | 2.62 | 409 | 0.0222 |
| 0.0229 | 2.62 | 410 | 0.0223 |
| 0.0208 | 2.63 | 411 | 0.0222 |
| 0.0197 | 2.64 | 412 | 0.0220 |
| 0.0215 | 2.64 | 413 | 0.0222 |
| 0.0216 | 2.65 | 414 | 0.0221 |
| 0.0194 | 2.66 | 415 | 0.0221 |
| 0.0223 | 2.66 | 416 | 0.0220 |
| 0.022 | 2.67 | 417 | 0.0220 |
| 0.0204 | 2.68 | 418 | 0.0218 |
| 0.0211 | 2.68 | 419 | 0.0219 |
| 0.0205 | 2.69 | 420 | 0.0218 |
| 0.021 | 2.69 | 421 | 0.0213 |
| 0.0206 | 2.7 | 422 | 0.0216 |
| 0.0261 | 2.71 | 423 | 0.0215 |
| 0.0234 | 2.71 | 424 | 0.0216 |
| 0.0246 | 2.72 | 425 | 0.0215 |
| 0.0242 | 2.73 | 426 | 0.0217 |
| 0.0204 | 2.73 | 427 | 0.0217 |
| 0.0214 | 2.74 | 428 | 0.0215 |
| 0.0235 | 2.75 | 429 | 0.0216 |
| 0.0196 | 2.75 | 430 | 0.0213 |
| 0.0223 | 2.76 | 431 | 0.0213 |
| 0.0239 | 2.76 | 432 | 0.0211 |
| 0.0224 | 2.77 | 433 | 0.0210 |
| 0.025 | 2.78 | 434 | 0.0210 |
| 0.0228 | 2.78 | 435 | 0.0210 |
| 0.0182 | 2.79 | 436 | 0.0207 |
| 0.0214 | 2.8 | 437 | 0.0208 |
| 0.022 | 2.8 | 438 | 0.0212 |
| 0.0233 | 2.81 | 439 | 0.0205 |
| 0.0209 | 2.82 | 440 | 0.0208 |
| 0.0194 | 2.82 | 441 | 0.0206 |
| 0.0215 | 2.83 | 442 | 0.0205 |
| 0.0242 | 2.84 | 443 | 0.0204 |
| 0.0219 | 2.84 | 444 | 0.0205 |
| 0.0217 | 2.85 | 445 | 0.0205 |
| 0.0237 | 2.85 | 446 | 0.0205 |
| 0.0214 | 2.86 | 447 | 0.0205 |
| 0.0224 | 2.87 | 448 | 0.0201 |
| 0.0199 | 2.87 | 449 | 0.0206 |
| 0.0219 | 2.88 | 450 | 0.0202 |
| 0.022 | 2.89 | 451 | 0.0202 |
| 0.0224 | 2.89 | 452 | 0.0204 |
| 0.0214 | 2.9 | 453 | 0.0200 |
| 0.0185 | 2.91 | 454 | 0.0201 |
| 0.0219 | 2.91 | 455 | 0.0202 |
| 0.0204 | 2.92 | 456 | 0.0203 |
| 0.0198 | 2.92 | 457 | 0.0201 |
| 0.0214 | 2.93 | 458 | 0.0202 |
| 0.0184 | 2.94 | 459 | 0.0203 |
| 0.0191 | 2.94 | 460 | 0.0200 |
| 0.0221 | 2.95 | 461 | 0.0205 |
| 0.0192 | 2.96 | 462 | 0.0204 |
| 0.0186 | 2.96 | 463 | 0.0204 |
| 0.0176 | 2.97 | 464 | 0.0201 |
| 0.0209 | 2.98 | 465 | 0.0203 |
| 0.0233 | 2.98 | 466 | 0.0200 |
| 0.0189 | 2.99 | 467 | 0.0204 |
| 0.0214 | 3.0 | 468 | 0.0201 |

### Framework versions

- Transformers 4.37.2
- Pytorch 2.1.0a0+32f93b1
- Datasets 2.17.1
- Tokenizers 0.15.2

generation_config.json ADDED
@@ -0,0 +1,6 @@
{
  "_from_model_config": true,
  "bos_token_id": 100000,
  "eos_token_id": 100001,
  "transformers_version": "4.37.2"
}
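
These defaults are what `model.generate()` picks up automatically when no overrides are passed. A small sketch, again assuming the hub path used above:

```python
from transformers import GenerationConfig

gen_cfg = GenerationConfig.from_pretrained("woody72/albert-no-variable-items-length")  # assumed hub path
print(gen_cfg.bos_token_id, gen_cfg.eos_token_id)  # 100000 100001
```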
model-00001-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4c6eb832131b147ade2a43c58c7699af6997a9300458434a6a8f0e7835ec7d58
size 4987202208
model-00002-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d11e3bd7bb39274628f4f1364142e7912fa83611e7a49c2be59395bd9c6fcb72
size 4980945440
model-00003-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:47f3ddac35c237d477480843a4af18a9eabf39473f6efc5a4df0c729ed8aa645
size 3852615520
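
These three entries are Git LFS pointers: the repository itself stores only the `oid`/`size` metadata, while the shard bytes live in LFS storage. A sketch of verifying a downloaded shard against its pointer (file names as above):

```python
import hashlib

def sha256_of(path: str) -> str:
    # Stream in 1 MiB chunks so multi-GB shards don't need to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "4c6eb832131b147ade2a43c58c7699af6997a9300458434a6a8f0e7835ec7d58"
assert sha256_of("model-00001-of-00003.safetensors") == expected
```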
model.safetensors.index.json ADDED
@@ -0,0 +1,280 @@
{
  "metadata": {
    "total_size": 13820731392
  },
  "weight_map": {
    "lm_head.weight": "model-00003-of-00003.safetensors",
    "model.embed_tokens.weight": "model-00001-of-00003.safetensors",
    "model.layers.0.input_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.0.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.0.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.1.input_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.1.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.1.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.10.input_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.10.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.10.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.10.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.10.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.10.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.10.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.10.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.10.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.11.input_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.11.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.11.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.11.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.11.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.11.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.11.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.11.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.11.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.12.input_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.12.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.12.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.12.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.12.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.12.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.12.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.12.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.12.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.13.input_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.13.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.13.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.13.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.13.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.13.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.13.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.13.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.13.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.14.input_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.14.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.14.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.14.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.14.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.14.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.14.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.14.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.14.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.15.input_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.15.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.15.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.15.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.15.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.15.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.15.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.15.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.15.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.16.input_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.16.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.16.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.16.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.16.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.16.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.16.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.16.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.16.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.17.input_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.17.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.17.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.17.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.17.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.17.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.17.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.17.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.17.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.18.input_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.18.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.18.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.18.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.18.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.18.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.18.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.18.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.18.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.19.input_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.19.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.19.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.19.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.19.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.19.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.19.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.19.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.19.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.2.input_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.2.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.2.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.20.input_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.20.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.20.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.20.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.20.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.20.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.20.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.20.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.21.input_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.21.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.21.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.21.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.21.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
    "model.layers.21.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.21.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.21.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.21.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.22.input_layernorm.weight": "model-00003-of-00003.safetensors",
    "model.layers.22.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.22.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.22.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.22.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
    "model.layers.22.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.22.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.22.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
    "model.layers.23.input_layernorm.weight": "model-00003-of-00003.safetensors",
    "model.layers.23.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.23.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.23.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.23.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
    "model.layers.23.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.23.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.23.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.23.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.24.input_layernorm.weight": "model-00003-of-00003.safetensors",
    "model.layers.24.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.24.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.24.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.24.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
    "model.layers.24.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.24.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.24.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.24.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.25.input_layernorm.weight": "model-00003-of-00003.safetensors",
    "model.layers.25.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.25.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.25.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.25.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
    "model.layers.25.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.25.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.25.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.25.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.26.input_layernorm.weight": "model-00003-of-00003.safetensors",
    "model.layers.26.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.26.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.26.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.26.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
    "model.layers.26.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.26.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.26.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.26.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.27.input_layernorm.weight": "model-00003-of-00003.safetensors",
    "model.layers.27.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.27.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.27.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.27.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
    "model.layers.27.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.27.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.27.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.27.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.28.input_layernorm.weight": "model-00003-of-00003.safetensors",
    "model.layers.28.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.28.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.28.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.28.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
    "model.layers.28.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.28.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.28.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.28.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.29.input_layernorm.weight": "model-00003-of-00003.safetensors",
    "model.layers.29.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.29.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.29.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.29.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
    "model.layers.29.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.29.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.29.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.29.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
    "model.layers.3.input_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.3.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.3.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.3.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.3.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.4.input_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.4.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.4.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.4.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.4.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.4.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.4.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.4.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.5.input_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.5.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.5.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.5.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.5.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.5.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.5.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.5.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.5.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.6.input_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.6.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.6.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.6.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.6.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.6.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.6.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.6.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.6.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.7.input_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.7.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.7.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.7.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.7.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.7.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.7.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.7.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.7.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.8.input_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.8.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.8.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.8.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.8.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.8.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.8.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.8.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.8.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.9.input_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.9.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.9.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.9.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.9.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
    "model.layers.9.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.9.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.9.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
    "model.layers.9.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
    "model.norm.weight": "model-00003-of-00003.safetensors"
  }
}
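
The `weight_map` lets a loader open only the shard that contains a given tensor. A minimal sketch of that lookup, assuming the three shard files sit next to the index:

```python
import json
from safetensors import safe_open

with open("model.safetensors.index.json") as f:
    index = json.load(f)

# Example: this MLP weight of layer 22 lives in shard 2 even though most of
# layer 22 spilled over into shard 3.
name = "model.layers.22.mlp.gate_proj.weight"
shard = index["weight_map"][name]  # "model-00002-of-00003.safetensors"

with safe_open(shard, framework="pt") as f:
    tensor = f.get_tensor(name)
print(name, tuple(tensor.shape))
```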