cryptoque committed
Commit ced20e8
Parent: 895a6a3

End of training
README.md CHANGED
@@ -22,7 +22,7 @@ model-index:
     metrics:
     - name: Accuracy
       type: accuracy
-      value: 0.83
+      value: 0.84
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -32,8 +32,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [ntu-spml/distilhubert](https://huggingface.co/ntu-spml/distilhubert) on the GTZAN dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.5842
-- Accuracy: 0.83
+- Loss: 0.6195
+- Accuracy: 0.84
 
 ## Model description
 
@@ -53,245 +53,39 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 5e-05
-- train_batch_size: 8
-- eval_batch_size: 8
+- train_batch_size: 2
+- eval_batch_size: 2
 - seed: 42
+- gradient_accumulation_steps: 10
+- total_train_batch_size: 20
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_ratio: 0.1
-- num_epochs: 10
+- num_epochs: 2
 - mixed_precision_training: Native AMP
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|
-| 1.8531 | 0.04 | 5 | 1.8761 | 0.42 |
-| 1.812 | 0.09 | 10 | 1.8698 | 0.42 |
-| 1.753 | 0.13 | 15 | 1.8647 | 0.43 |
-| 1.8981 | 0.18 | 20 | 1.8612 | 0.43 |
-| 1.9093 | 0.22 | 25 | 1.8481 | 0.43 |
-| 1.7549 | 0.27 | 30 | 1.8362 | 0.44 |
-| 1.7611 | 0.31 | 35 | 1.8354 | 0.46 |
-| 1.8847 | 0.35 | 40 | 1.8073 | 0.46 |
-| 1.9125 | 0.4 | 45 | 1.7782 | 0.46 |
-| 1.7412 | 0.44 | 50 | 1.7509 | 0.46 |
-| 1.8151 | 0.49 | 55 | 1.7341 | 0.5 |
-| 1.8575 | 0.53 | 60 | 1.7421 | 0.51 |
-| 1.7021 | 0.58 | 65 | 1.7142 | 0.48 |
-| 1.754 | 0.62 | 70 | 1.6808 | 0.47 |
-| 1.8043 | 0.66 | 75 | 1.6533 | 0.49 |
-| 1.7412 | 0.71 | 80 | 1.6206 | 0.52 |
-| 1.7192 | 0.75 | 85 | 1.6176 | 0.57 |
-| 1.6695 | 0.8 | 90 | 1.5824 | 0.56 |
-| 1.6737 | 0.84 | 95 | 1.5538 | 0.55 |
-| 1.5608 | 0.88 | 100 | 1.5354 | 0.58 |
-| 1.5808 | 0.93 | 105 | 1.4885 | 0.61 |
-| 1.6079 | 0.97 | 110 | 1.4487 | 0.64 |
-| 1.4 | 1.02 | 115 | 1.4126 | 0.62 |
-| 1.4354 | 1.06 | 120 | 1.4111 | 0.58 |
-| 1.4721 | 1.11 | 125 | 1.5071 | 0.59 |
-| 1.5932 | 1.15 | 130 | 1.4401 | 0.58 |
-| 1.55 | 1.19 | 135 | 1.3421 | 0.65 |
-| 1.4736 | 1.24 | 140 | 1.3357 | 0.66 |
-| 1.2953 | 1.28 | 145 | 1.3079 | 0.64 |
-| 1.447 | 1.33 | 150 | 1.2871 | 0.65 |
-| 1.2568 | 1.37 | 155 | 1.3049 | 0.66 |
-| 1.2429 | 1.42 | 160 | 1.3007 | 0.66 |
-| 1.2131 | 1.46 | 165 | 1.2711 | 0.62 |
-| 1.2334 | 1.5 | 170 | 1.1946 | 0.67 |
-| 1.259 | 1.55 | 175 | 1.2145 | 0.64 |
-| 1.1372 | 1.59 | 180 | 1.1661 | 0.67 |
-| 1.2354 | 1.64 | 185 | 1.1491 | 0.68 |
-| 1.2887 | 1.68 | 190 | 1.2242 | 0.68 |
-| 1.0687 | 1.73 | 195 | 1.0990 | 0.69 |
-| 1.1841 | 1.77 | 200 | 1.1206 | 0.68 |
-| 1.1771 | 1.81 | 205 | 1.1175 | 0.66 |
-| 1.2318 | 1.86 | 210 | 1.1317 | 0.65 |
-| 1.0428 | 1.9 | 215 | 1.2576 | 0.61 |
-| 1.2884 | 1.95 | 220 | 1.1615 | 0.67 |
-| 1.1047 | 1.99 | 225 | 1.0426 | 0.73 |
-| 1.1997 | 2.04 | 230 | 1.0549 | 0.66 |
-| 1.017 | 2.08 | 235 | 1.0164 | 0.7 |
-| 1.0552 | 2.12 | 240 | 1.0320 | 0.73 |
-| 0.9996 | 2.17 | 245 | 0.9538 | 0.73 |
-| 0.8955 | 2.21 | 250 | 0.9836 | 0.65 |
-| 0.9589 | 2.26 | 255 | 0.9806 | 0.68 |
-| 0.9798 | 2.3 | 260 | 1.0008 | 0.71 |
-| 1.0399 | 2.35 | 265 | 0.9794 | 0.74 |
-| 1.0752 | 2.39 | 270 | 0.9904 | 0.75 |
-| 0.9932 | 2.43 | 275 | 0.9367 | 0.76 |
-| 0.8616 | 2.48 | 280 | 0.9559 | 0.72 |
-| 0.9397 | 2.52 | 285 | 0.9290 | 0.74 |
-| 0.723 | 2.57 | 290 | 0.8597 | 0.78 |
-| 0.744 | 2.61 | 295 | 0.8666 | 0.76 |
-| 1.0076 | 2.65 | 300 | 0.9164 | 0.78 |
-| 0.9082 | 2.7 | 305 | 0.8460 | 0.75 |
-| 0.8282 | 2.74 | 310 | 0.8389 | 0.75 |
-| 0.9619 | 2.79 | 315 | 0.9278 | 0.71 |
-| 1.1567 | 2.83 | 320 | 0.9373 | 0.71 |
-| 0.7183 | 2.88 | 325 | 0.8854 | 0.74 |
-| 0.8553 | 2.92 | 330 | 0.9190 | 0.7 |
-| 0.9772 | 2.96 | 335 | 0.9902 | 0.72 |
-| 0.8981 | 3.01 | 340 | 0.8525 | 0.72 |
-| 0.714 | 3.05 | 345 | 0.8258 | 0.76 |
-| 0.7801 | 3.1 | 350 | 0.8276 | 0.74 |
-| 0.8413 | 3.14 | 355 | 0.8275 | 0.79 |
-| 0.6662 | 3.19 | 360 | 0.8478 | 0.78 |
-| 0.7231 | 3.23 | 365 | 0.8143 | 0.77 |
-| 0.9157 | 3.27 | 370 | 0.9178 | 0.77 |
-| 0.65 | 3.32 | 375 | 0.7662 | 0.76 |
-| 0.5838 | 3.36 | 380 | 0.7915 | 0.75 |
-| 0.7833 | 3.41 | 385 | 0.9211 | 0.78 |
-| 0.6006 | 3.45 | 390 | 0.8560 | 0.76 |
-| 0.7107 | 3.5 | 395 | 0.7816 | 0.79 |
-| 0.7087 | 3.54 | 400 | 0.8082 | 0.75 |
-| 0.6937 | 3.58 | 405 | 0.9886 | 0.75 |
-| 0.5178 | 3.63 | 410 | 0.8433 | 0.73 |
-| 0.8805 | 3.67 | 415 | 0.7730 | 0.74 |
-| 0.508 | 3.72 | 420 | 0.7592 | 0.76 |
-| 0.7176 | 3.76 | 425 | 0.7985 | 0.77 |
-| 0.9208 | 3.81 | 430 | 0.7707 | 0.79 |
-| 0.5551 | 3.85 | 435 | 0.7049 | 0.82 |
-| 0.7906 | 3.89 | 440 | 0.6906 | 0.8 |
-| 0.6059 | 3.94 | 445 | 0.7218 | 0.75 |
-| 0.5223 | 3.98 | 450 | 0.7037 | 0.8 |
-| 0.6692 | 4.03 | 455 | 0.7426 | 0.76 |
-| 0.6134 | 4.07 | 460 | 0.7388 | 0.75 |
-| 0.6148 | 4.12 | 465 | 0.7697 | 0.75 |
-| 0.6798 | 4.16 | 470 | 0.7617 | 0.76 |
-| 0.6189 | 4.2 | 475 | 0.7694 | 0.73 |
-| 0.4468 | 4.25 | 480 | 0.7357 | 0.75 |
-| 0.5356 | 4.29 | 485 | 0.7982 | 0.73 |
-| 0.8247 | 4.34 | 490 | 0.7061 | 0.77 |
-| 0.5487 | 4.38 | 495 | 0.7365 | 0.79 |
-| 0.5388 | 4.42 | 500 | 0.8114 | 0.75 |
-| 0.5091 | 4.47 | 505 | 0.7090 | 0.79 |
-| 0.5163 | 4.51 | 510 | 0.6481 | 0.8 |
-| 0.4583 | 4.56 | 515 | 0.6730 | 0.82 |
-| 0.6146 | 4.6 | 520 | 0.6506 | 0.82 |
-| 0.5108 | 4.65 | 525 | 0.6559 | 0.8 |
-| 0.5093 | 4.69 | 530 | 0.6740 | 0.8 |
-| 0.461 | 4.73 | 535 | 0.6804 | 0.78 |
-| 0.4242 | 4.78 | 540 | 0.9112 | 0.75 |
-| 0.406 | 4.82 | 545 | 0.7963 | 0.77 |
-| 0.4935 | 4.87 | 550 | 0.6577 | 0.82 |
-| 0.3039 | 4.91 | 555 | 0.7185 | 0.78 |
-| 0.5397 | 4.96 | 560 | 0.7497 | 0.79 |
-| 0.3281 | 5.0 | 565 | 0.6407 | 0.84 |
-| 0.369 | 5.04 | 570 | 0.6205 | 0.78 |
-| 0.5637 | 5.09 | 575 | 0.6138 | 0.82 |
-| 0.348 | 5.13 | 580 | 0.6837 | 0.81 |
-| 0.3551 | 5.18 | 585 | 0.6684 | 0.81 |
-| 0.5671 | 5.22 | 590 | 0.5909 | 0.83 |
-| 0.4109 | 5.27 | 595 | 0.6191 | 0.81 |
-| 0.5051 | 5.31 | 600 | 0.6868 | 0.78 |
-| 0.4184 | 5.35 | 605 | 0.6615 | 0.8 |
-| 0.3283 | 5.4 | 610 | 0.6240 | 0.83 |
-| 0.4155 | 5.44 | 615 | 0.6264 | 0.83 |
-| 0.3777 | 5.49 | 620 | 0.6237 | 0.83 |
-| 0.1838 | 5.53 | 625 | 0.7052 | 0.79 |
-| 0.305 | 5.58 | 630 | 0.6756 | 0.79 |
-| 0.4242 | 5.62 | 635 | 0.6332 | 0.79 |
-| 0.2501 | 5.66 | 640 | 0.6196 | 0.8 |
-| 0.319 | 5.71 | 645 | 0.5969 | 0.81 |
-| 0.355 | 5.75 | 650 | 0.6731 | 0.79 |
-| 0.3834 | 5.8 | 655 | 0.6952 | 0.74 |
-| 0.3246 | 5.84 | 660 | 0.7445 | 0.77 |
-| 0.27 | 5.88 | 665 | 0.7348 | 0.77 |
-| 0.3062 | 5.93 | 670 | 0.6162 | 0.82 |
-| 0.3608 | 5.97 | 675 | 0.5779 | 0.82 |
-| 0.1759 | 6.02 | 680 | 0.5926 | 0.83 |
-| 0.2594 | 6.06 | 685 | 0.7136 | 0.76 |
-| 0.3159 | 6.11 | 690 | 0.7404 | 0.77 |
-| 0.335 | 6.15 | 695 | 0.6222 | 0.8 |
-| 0.2409 | 6.19 | 700 | 0.6046 | 0.82 |
-| 0.2418 | 6.24 | 705 | 0.6096 | 0.83 |
-| 0.2934 | 6.28 | 710 | 0.6439 | 0.8 |
-| 0.3318 | 6.33 | 715 | 0.6253 | 0.83 |
-| 0.2237 | 6.37 | 720 | 0.6279 | 0.86 |
-| 0.2127 | 6.42 | 725 | 0.5995 | 0.82 |
-| 0.1976 | 6.46 | 730 | 0.6166 | 0.81 |
-| 0.3633 | 6.5 | 735 | 0.6673 | 0.8 |
-| 0.2623 | 6.55 | 740 | 0.6785 | 0.79 |
-| 0.4048 | 6.59 | 745 | 0.6219 | 0.8 |
-| 0.2785 | 6.64 | 750 | 0.5921 | 0.83 |
-| 0.1547 | 6.68 | 755 | 0.5978 | 0.83 |
-| 0.1684 | 6.73 | 760 | 0.5948 | 0.82 |
-| 0.1275 | 6.77 | 765 | 0.6284 | 0.77 |
-| 0.3332 | 6.81 | 770 | 0.5986 | 0.82 |
-| 0.1274 | 6.86 | 775 | 0.5664 | 0.8 |
-| 0.2948 | 6.9 | 780 | 0.5906 | 0.83 |
-| 0.3332 | 6.95 | 785 | 0.6109 | 0.83 |
-| 0.2647 | 6.99 | 790 | 0.6279 | 0.8 |
-| 0.2612 | 7.04 | 795 | 0.7166 | 0.81 |
-| 0.1695 | 7.08 | 800 | 0.5892 | 0.84 |
-| 0.3276 | 7.12 | 805 | 0.5794 | 0.8 |
-| 0.2943 | 7.17 | 810 | 0.5662 | 0.82 |
-| 0.2017 | 7.21 | 815 | 0.5683 | 0.82 |
-| 0.2097 | 7.26 | 820 | 0.5817 | 0.86 |
-| 0.1614 | 7.3 | 825 | 0.5457 | 0.86 |
-| 0.1219 | 7.35 | 830 | 0.5288 | 0.87 |
-| 0.1221 | 7.39 | 835 | 0.5386 | 0.87 |
-| 0.2657 | 7.43 | 840 | 0.5598 | 0.84 |
-| 0.1619 | 7.48 | 845 | 0.5654 | 0.82 |
-| 0.1306 | 7.52 | 850 | 0.5876 | 0.83 |
-| 0.1529 | 7.57 | 855 | 0.6047 | 0.81 |
-| 0.1625 | 7.61 | 860 | 0.5949 | 0.8 |
-| 0.189 | 7.65 | 865 | 0.5648 | 0.83 |
-| 0.1553 | 7.7 | 870 | 0.5584 | 0.85 |
-| 0.1318 | 7.74 | 875 | 0.5779 | 0.85 |
-| 0.0816 | 7.79 | 880 | 0.6156 | 0.83 |
-| 0.2986 | 7.83 | 885 | 0.6317 | 0.83 |
-| 0.1464 | 7.88 | 890 | 0.6120 | 0.83 |
-| 0.1597 | 7.92 | 895 | 0.6061 | 0.83 |
-| 0.1721 | 7.96 | 900 | 0.6255 | 0.8 |
-| 0.1271 | 8.01 | 905 | 0.6463 | 0.81 |
-| 0.1193 | 8.05 | 910 | 0.6313 | 0.82 |
-| 0.1354 | 8.1 | 915 | 0.5775 | 0.83 |
-| 0.109 | 8.14 | 920 | 0.5591 | 0.83 |
-| 0.0734 | 8.19 | 925 | 0.5574 | 0.84 |
-| 0.2419 | 8.23 | 930 | 0.5491 | 0.84 |
-| 0.1374 | 8.27 | 935 | 0.5664 | 0.84 |
-| 0.1205 | 8.32 | 940 | 0.5834 | 0.85 |
-| 0.2701 | 8.36 | 945 | 0.5968 | 0.85 |
-| 0.1315 | 8.41 | 950 | 0.5870 | 0.84 |
-| 0.0797 | 8.45 | 955 | 0.5925 | 0.82 |
-| 0.1462 | 8.5 | 960 | 0.5856 | 0.8 |
-| 0.0929 | 8.54 | 965 | 0.6031 | 0.8 |
-| 0.0899 | 8.58 | 970 | 0.6265 | 0.82 |
-| 0.1678 | 8.63 | 975 | 0.6082 | 0.82 |
-| 0.1362 | 8.67 | 980 | 0.5944 | 0.82 |
-| 0.114 | 8.72 | 985 | 0.6119 | 0.82 |
-| 0.083 | 8.76 | 990 | 0.6308 | 0.81 |
-| 0.2738 | 8.81 | 995 | 0.6119 | 0.82 |
-| 0.1005 | 8.85 | 1000 | 0.5878 | 0.82 |
-| 0.1127 | 8.89 | 1005 | 0.5896 | 0.83 |
-| 0.0551 | 8.94 | 1010 | 0.5912 | 0.83 |
-| 0.1891 | 8.98 | 1015 | 0.6001 | 0.83 |
-| 0.0997 | 9.03 | 1020 | 0.6005 | 0.83 |
-| 0.1219 | 9.07 | 1025 | 0.6108 | 0.84 |
-| 0.0608 | 9.12 | 1030 | 0.6233 | 0.82 |
-| 0.1651 | 9.16 | 1035 | 0.6122 | 0.83 |
-| 0.0557 | 9.2 | 1040 | 0.6035 | 0.81 |
-| 0.2139 | 9.25 | 1045 | 0.6026 | 0.83 |
-| 0.0682 | 9.29 | 1050 | 0.6005 | 0.83 |
-| 0.174 | 9.34 | 1055 | 0.5982 | 0.83 |
-| 0.2703 | 9.38 | 1060 | 0.5981 | 0.84 |
-| 0.1075 | 9.42 | 1065 | 0.5893 | 0.84 |
-| 0.108 | 9.47 | 1070 | 0.5842 | 0.84 |
-| 0.0669 | 9.51 | 1075 | 0.5849 | 0.84 |
-| 0.0785 | 9.56 | 1080 | 0.5846 | 0.84 |
-| 0.1578 | 9.6 | 1085 | 0.5868 | 0.84 |
-| 0.1029 | 9.65 | 1090 | 0.5893 | 0.84 |
-| 0.0587 | 9.69 | 1095 | 0.5886 | 0.84 |
-| 0.0738 | 9.73 | 1100 | 0.5849 | 0.84 |
-| 0.1114 | 9.78 | 1105 | 0.5821 | 0.84 |
-| 0.0642 | 9.82 | 1110 | 0.5828 | 0.84 |
-| 0.0759 | 9.87 | 1115 | 0.5839 | 0.84 |
-| 0.0716 | 9.91 | 1120 | 0.5847 | 0.83 |
-| 0.0515 | 9.96 | 1125 | 0.5844 | 0.83 |
-| 0.0719 | 10.0 | 1130 | 0.5842 | 0.83 |
+| 0.1191 | 0.11 | 5 | 0.6123 | 0.84 |
+| 0.1533 | 0.22 | 10 | 0.6686 | 0.81 |
+| 0.1185 | 0.33 | 15 | 0.6119 | 0.82 |
+| 0.1053 | 0.44 | 20 | 0.6488 | 0.82 |
+| 0.1608 | 0.56 | 25 | 0.7321 | 0.82 |
+| 0.1185 | 0.67 | 30 | 0.5918 | 0.88 |
+| 0.1096 | 0.78 | 35 | 0.5839 | 0.82 |
+| 0.1207 | 0.89 | 40 | 0.5980 | 0.84 |
+| 0.0845 | 1.0 | 45 | 0.6426 | 0.86 |
+| 0.1001 | 1.11 | 50 | 0.6154 | 0.84 |
+| 0.083 | 1.22 | 55 | 0.5982 | 0.85 |
+| 0.0564 | 1.33 | 60 | 0.5937 | 0.87 |
+| 0.0935 | 1.44 | 65 | 0.6189 | 0.87 |
+| 0.0591 | 1.56 | 70 | 0.5973 | 0.84 |
+| 0.0951 | 1.67 | 75 | 0.6225 | 0.84 |
+| 0.0484 | 1.78 | 80 | 0.6454 | 0.81 |
+| 0.0621 | 1.89 | 85 | 0.6284 | 0.83 |
+| 0.1254 | 2.0 | 90 | 0.6195 | 0.84 |
 
 
 ### Framework versions
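The updated hyperparameters replace a per-device batch size of 8 with a size of 2 plus gradient accumulation over 10 steps. A short sketch of the implied arithmetic, assuming a GTZAN train split of roughly 900 clips (an assumption; the split size is not stated in the diff):

```python
# Effective batch size and steps-per-epoch implied by the new hyperparameters.
import math

train_batch_size = 2              # per-device batch size from the updated card
gradient_accumulation_steps = 10  # from the updated card
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)     # 20, matching the card's total_train_batch_size

num_train_examples = 900          # assumption: ~900 training clips from GTZAN's 1,000
steps_per_epoch = math.ceil(num_train_examples / total_train_batch_size)
print(steps_per_epoch)            # 45, consistent with the table reaching epoch 1.0 at step 45
```

With 2 epochs this yields the 90 optimizer steps shown in the new training-results table.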
config.json CHANGED
@@ -1,5 +1,5 @@
 {
-  "_name_or_path": "ntu-spml/distilhubert",
+  "_name_or_path": "distilhubert-finetuned-gtzan/checkpoint-1000",
   "activation_dropout": 0.1,
   "apply_spec_augment": false,
   "architectures": [
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:311350e53901944b8290260233829ce3c81da378ce75070b72ffbd75f0f510eb
+oid sha256:169d77ae050067bde8464e22e8be3eca457f3175c62a7591826260f6624f4d91
 size 94771728
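Note that the model.safetensors entry in this diff is a Git LFS pointer file, not the weights themselves: only the three-line pointer (spec version, content hash, byte size) lives in the git history. A minimal parser for that pointer format, using the new-side values from the diff:

```python
# Parse a git-lfs pointer file ("key value" per line) into a dict.
def parse_lfs_pointer(text: str) -> dict:
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The new-side pointer content for model.safetensors, verbatim from the diff.
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:169d77ae050067bde8464e22e8be3eca457f3175c62a7591826260f6624f4d91
size 94771728
"""

info = parse_lfs_pointer(pointer)
print(info["size"])  # 94771728 bytes, i.e. ~95 MB of weights stored in LFS
```

The same format applies to the training_args.bin and events.out.tfevents entries below.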
runs/Jan26_05-46-39_d759b9d8f8ea/events.out.tfevents.1706248676.d759b9d8f8ea.13261.4 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b83f0ca912a1379411e3809fec0e5bd48a4f9c92c639b2f10089e5dabe943a19
+size 14481
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:ee78752708ba761309f71cc209c42873592206752fb7003c64deaf273fd3d082
+oid sha256:4e5e2e74a0c002b36e22589288ba9a4bf81f2b0b6e893c265985a726ecfb2c60
 size 4728