ndiy committed
Commit 4035715 · verified · 1 Parent(s): 471fb59

End of training
README.md CHANGED
@@ -14,10 +14,10 @@ should probably proofread and complete it, then remove this comment. -->
 
 # ASPECT_SENT
 
-This model is a fine-tuned version of [techthiyanes/chinese_sentiment](https://huggingface.co/techthiyanes/chinese_sentiment) on an unknown dataset.
+This model is a fine-tuned version of [techthiyanes/chinese_sentiment](https://huggingface.co/techthiyanes/chinese_sentiment) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 3.0006
-- Accuracy: 0.1399
+- Loss: 0.5445
+- Accuracy: 0.7876
 
 ## Model description
 
@@ -37,27 +37,20 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 3e-05
-- train_batch_size: 64
+- train_batch_size: 32
 - eval_batch_size: 8
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 10
+- num_epochs: 3
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy |
 |:-------------:|:-----:|:-----:|:---------------:|:--------:|
-| 3.0532 | 1.0 | 3376 | 3.0970 | 0.1370 |
-| 2.9743 | 2.0 | 6752 | 2.9974 | 0.1423 |
-| 2.9687 | 3.0 | 10128 | 2.9661 | 0.1427 |
-| 2.8626 | 4.0 | 13504 | 2.9584 | 0.1419 |
-| 2.834 | 5.0 | 16880 | 2.9616 | 0.1421 |
-| 2.7554 | 6.0 | 20256 | 2.9660 | 0.1426 |
-| 2.7024 | 7.0 | 23632 | 2.9738 | 0.1400 |
-| 2.6473 | 8.0 | 27008 | 2.9881 | 0.1402 |
-| 2.6103 | 9.0 | 30384 | 2.9958 | 0.1398 |
-| 2.5661 | 10.0 | 33760 | 3.0006 | 0.1399 |
+| 0.5678 | 1.0 | 6668 | 0.5337 | 0.7804 |
+| 0.404 | 2.0 | 13336 | 0.5267 | 0.7900 |
+| 0.3276 | 3.0 | 20004 | 0.5445 | 0.7876 |
 
 
 ### Framework versions
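The new results table is internally consistent with the updated hyperparameters: three epochs at 6668 optimizer steps each end at step 20004. A quick sketch of that arithmetic (the per-epoch step count comes from the table above; the training-set size is not reported, so the last lines only bound it):

```python
# Values taken from the updated README tables above.
train_batch_size = 32
steps_per_epoch = 6668   # step count at the end of epoch 1.0
num_epochs = 3

# Total optimizer steps after the final epoch.
total_steps = steps_per_epoch * num_epochs
print(total_steps)  # 20004, the step count in the last table row

# steps_per_epoch == ceil(n_train / train_batch_size), which bounds the
# unreported training-set size without pinning it down.
n_min = (steps_per_epoch - 1) * train_batch_size + 1
n_max = steps_per_epoch * train_batch_size
print(n_min, n_max)  # 213345 213376
```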
config.json CHANGED
@@ -12,458 +12,14 @@
   "id2label": {
     "0": "LABEL_0",
     "1": "LABEL_1",
-    "2": "LABEL_2",
-    "3": "LABEL_3",
-    "4": "LABEL_4",
-    "5": "LABEL_5",
-    "6": "LABEL_6",
-    "7": "LABEL_7",
-    "8": "LABEL_8",
-    "9": "LABEL_9",
-    "10": "LABEL_10",
-    "11": "LABEL_11",
-    "12": "LABEL_12",
-    "13": "LABEL_13",
-    "14": "LABEL_14",
-    "15": "LABEL_15",
-    "16": "LABEL_16",
-    "17": "LABEL_17",
-    "18": "LABEL_18",
-    "19": "LABEL_19",
-    "20": "LABEL_20",
-    "21": "LABEL_21",
-    "22": "LABEL_22",
-    "23": "LABEL_23",
-    "24": "LABEL_24",
-    "25": "LABEL_25",
-    "26": "LABEL_26",
-    "27": "LABEL_27",
-    "28": "LABEL_28",
-    "29": "LABEL_29",
-    "30": "LABEL_30",
-    "31": "LABEL_31",
-    "32": "LABEL_32",
-    "33": "LABEL_33",
-    "34": "LABEL_34",
-    "35": "LABEL_35",
-    "36": "LABEL_36",
-    "37": "LABEL_37",
-    "38": "LABEL_38",
-    "39": "LABEL_39",
-    "40": "LABEL_40",
-    "41": "LABEL_41",
-    "42": "LABEL_42",
-    "43": "LABEL_43",
-    "44": "LABEL_44",
-    "45": "LABEL_45",
-    "46": "LABEL_46",
-    "47": "LABEL_47",
-    "48": "LABEL_48",
-    "49": "LABEL_49",
-    "50": "LABEL_50",
-    "51": "LABEL_51",
-    "52": "LABEL_52",
-    "53": "LABEL_53",
-    "54": "LABEL_54",
-    "55": "LABEL_55",
-    "56": "LABEL_56",
-    "57": "LABEL_57",
-    "58": "LABEL_58",
-    "59": "LABEL_59",
-    "60": "LABEL_60",
-    "61": "LABEL_61",
-    "62": "LABEL_62",
-    "63": "LABEL_63",
-    "64": "LABEL_64",
-    "65": "LABEL_65",
-    "66": "LABEL_66",
-    "67": "LABEL_67",
-    "68": "LABEL_68",
-    "69": "LABEL_69",
-    "70": "LABEL_70",
-    "71": "LABEL_71",
-    "72": "LABEL_72",
-    "73": "LABEL_73",
-    "74": "LABEL_74",
-    "75": "LABEL_75",
-    "76": "LABEL_76",
-    "77": "LABEL_77",
-    "78": "LABEL_78",
-    "79": "LABEL_79",
-    "80": "LABEL_80",
-    "81": "LABEL_81",
-    "82": "LABEL_82",
-    "83": "LABEL_83",
-    "84": "LABEL_84",
-    "85": "LABEL_85",
-    "86": "LABEL_86",
-    "87": "LABEL_87",
-    "88": "LABEL_88",
-    "89": "LABEL_89",
-    "90": "LABEL_90",
-    "91": "LABEL_91",
-    "92": "LABEL_92",
-    "93": "LABEL_93",
-    "94": "LABEL_94",
-    "95": "LABEL_95",
-    "96": "LABEL_96",
-    "97": "LABEL_97",
-    "98": "LABEL_98",
-    "99": "LABEL_99",
-    "100": "LABEL_100",
-    "101": "LABEL_101",
-    "102": "LABEL_102",
-    "103": "LABEL_103",
-    "104": "LABEL_104",
-    "105": "LABEL_105",
-    "106": "LABEL_106",
-    "107": "LABEL_107",
-    "108": "LABEL_108",
-    "109": "LABEL_109",
-    "110": "LABEL_110",
-    "111": "LABEL_111",
-    "112": "LABEL_112",
-    "113": "LABEL_113",
-    "114": "LABEL_114",
-    "115": "LABEL_115",
-    "116": "LABEL_116",
-    "117": "LABEL_117",
-    "118": "LABEL_118",
-    "119": "LABEL_119",
-    "120": "LABEL_120",
-    "121": "LABEL_121",
-    "122": "LABEL_122",
-    "123": "LABEL_123",
-    "124": "LABEL_124",
-    "125": "LABEL_125",
-    "126": "LABEL_126",
-    "127": "LABEL_127",
-    "128": "LABEL_128",
-    "129": "LABEL_129",
-    "130": "LABEL_130",
-    "131": "LABEL_131",
-    "132": "LABEL_132",
-    "133": "LABEL_133",
-    "134": "LABEL_134",
-    "135": "LABEL_135",
-    "136": "LABEL_136",
-    "137": "LABEL_137",
-    "138": "LABEL_138",
-    "139": "LABEL_139",
-    "140": "LABEL_140",
-    "141": "LABEL_141",
-    "142": "LABEL_142",
-    "143": "LABEL_143",
-    "144": "LABEL_144",
-    "145": "LABEL_145",
-    "146": "LABEL_146",
-    "147": "LABEL_147",
-    "148": "LABEL_148",
-    "149": "LABEL_149",
-    "150": "LABEL_150",
-    "151": "LABEL_151",
-    "152": "LABEL_152",
-    "153": "LABEL_153",
-    "154": "LABEL_154",
-    "155": "LABEL_155",
-    "156": "LABEL_156",
-    "157": "LABEL_157",
-    "158": "LABEL_158",
-    "159": "LABEL_159",
-    "160": "LABEL_160",
-    "161": "LABEL_161",
-    "162": "LABEL_162",
-    "163": "LABEL_163",
-    "164": "LABEL_164",
-    "165": "LABEL_165",
-    "166": "LABEL_166",
-    "167": "LABEL_167",
-    "168": "LABEL_168",
-    "169": "LABEL_169",
-    "170": "LABEL_170",
-    "171": "LABEL_171",
-    "172": "LABEL_172",
-    "173": "LABEL_173",
-    "174": "LABEL_174",
-    "175": "LABEL_175",
-    "176": "LABEL_176",
-    "177": "LABEL_177",
-    "178": "LABEL_178",
-    "179": "LABEL_179",
-    "180": "LABEL_180",
-    "181": "LABEL_181",
-    "182": "LABEL_182",
-    "183": "LABEL_183",
-    "184": "LABEL_184",
-    "185": "LABEL_185",
-    "186": "LABEL_186",
-    "187": "LABEL_187",
-    "188": "LABEL_188",
-    "189": "LABEL_189",
-    "190": "LABEL_190",
-    "191": "LABEL_191",
-    "192": "LABEL_192",
-    "193": "LABEL_193",
-    "194": "LABEL_194",
-    "195": "LABEL_195",
-    "196": "LABEL_196",
-    "197": "LABEL_197",
-    "198": "LABEL_198",
-    "199": "LABEL_199",
-    "200": "LABEL_200",
-    "201": "LABEL_201",
-    "202": "LABEL_202",
-    "203": "LABEL_203",
-    "204": "LABEL_204",
-    "205": "LABEL_205",
-    "206": "LABEL_206",
-    "207": "LABEL_207",
-    "208": "LABEL_208",
-    "209": "LABEL_209",
-    "210": "LABEL_210",
-    "211": "LABEL_211",
-    "212": "LABEL_212",
-    "213": "LABEL_213",
-    "214": "LABEL_214",
-    "215": "LABEL_215",
-    "216": "LABEL_216",
-    "217": "LABEL_217",
-    "218": "LABEL_218",
-    "219": "LABEL_219",
-    "220": "LABEL_220",
-    "221": "LABEL_221",
-    "222": "LABEL_222",
-    "223": "LABEL_223",
-    "224": "LABEL_224"
+    "2": "LABEL_2"
   },
   "initializer_range": 0.02,
   "intermediate_size": 3072,
   "label2id": {
     "LABEL_0": 0,
     "LABEL_1": 1,
-    "LABEL_10": 10,
-    "LABEL_100": 100,
-    "LABEL_101": 101,
-    "LABEL_102": 102,
-    "LABEL_103": 103,
-    "LABEL_104": 104,
-    "LABEL_105": 105,
-    "LABEL_106": 106,
-    "LABEL_107": 107,
-    "LABEL_108": 108,
-    "LABEL_109": 109,
-    "LABEL_11": 11,
-    "LABEL_110": 110,
-    "LABEL_111": 111,
-    "LABEL_112": 112,
-    "LABEL_113": 113,
-    "LABEL_114": 114,
-    "LABEL_115": 115,
-    "LABEL_116": 116,
-    "LABEL_117": 117,
-    "LABEL_118": 118,
-    "LABEL_119": 119,
-    "LABEL_12": 12,
-    "LABEL_120": 120,
-    "LABEL_121": 121,
-    "LABEL_122": 122,
-    "LABEL_123": 123,
-    "LABEL_124": 124,
-    "LABEL_125": 125,
-    "LABEL_126": 126,
-    "LABEL_127": 127,
-    "LABEL_128": 128,
-    "LABEL_129": 129,
-    "LABEL_13": 13,
-    "LABEL_130": 130,
-    "LABEL_131": 131,
-    "LABEL_132": 132,
-    "LABEL_133": 133,
-    "LABEL_134": 134,
-    "LABEL_135": 135,
-    "LABEL_136": 136,
-    "LABEL_137": 137,
-    "LABEL_138": 138,
-    "LABEL_139": 139,
-    "LABEL_14": 14,
-    "LABEL_140": 140,
-    "LABEL_141": 141,
-    "LABEL_142": 142,
-    "LABEL_143": 143,
-    "LABEL_144": 144,
-    "LABEL_145": 145,
-    "LABEL_146": 146,
-    "LABEL_147": 147,
-    "LABEL_148": 148,
-    "LABEL_149": 149,
-    "LABEL_15": 15,
-    "LABEL_150": 150,
-    "LABEL_151": 151,
-    "LABEL_152": 152,
-    "LABEL_153": 153,
-    "LABEL_154": 154,
-    "LABEL_155": 155,
-    "LABEL_156": 156,
-    "LABEL_157": 157,
-    "LABEL_158": 158,
-    "LABEL_159": 159,
-    "LABEL_16": 16,
-    "LABEL_160": 160,
-    "LABEL_161": 161,
-    "LABEL_162": 162,
-    "LABEL_163": 163,
-    "LABEL_164": 164,
-    "LABEL_165": 165,
-    "LABEL_166": 166,
-    "LABEL_167": 167,
-    "LABEL_168": 168,
-    "LABEL_169": 169,
-    "LABEL_17": 17,
-    "LABEL_170": 170,
-    "LABEL_171": 171,
-    "LABEL_172": 172,
-    "LABEL_173": 173,
-    "LABEL_174": 174,
-    "LABEL_175": 175,
-    "LABEL_176": 176,
-    "LABEL_177": 177,
-    "LABEL_178": 178,
-    "LABEL_179": 179,
-    "LABEL_18": 18,
-    "LABEL_180": 180,
-    "LABEL_181": 181,
-    "LABEL_182": 182,
-    "LABEL_183": 183,
-    "LABEL_184": 184,
-    "LABEL_185": 185,
-    "LABEL_186": 186,
-    "LABEL_187": 187,
-    "LABEL_188": 188,
-    "LABEL_189": 189,
-    "LABEL_19": 19,
-    "LABEL_190": 190,
-    "LABEL_191": 191,
-    "LABEL_192": 192,
-    "LABEL_193": 193,
-    "LABEL_194": 194,
-    "LABEL_195": 195,
-    "LABEL_196": 196,
-    "LABEL_197": 197,
-    "LABEL_198": 198,
-    "LABEL_199": 199,
-    "LABEL_2": 2,
-    "LABEL_20": 20,
-    "LABEL_200": 200,
-    "LABEL_201": 201,
-    "LABEL_202": 202,
-    "LABEL_203": 203,
-    "LABEL_204": 204,
-    "LABEL_205": 205,
-    "LABEL_206": 206,
-    "LABEL_207": 207,
-    "LABEL_208": 208,
-    "LABEL_209": 209,
-    "LABEL_21": 21,
-    "LABEL_210": 210,
-    "LABEL_211": 211,
-    "LABEL_212": 212,
-    "LABEL_213": 213,
-    "LABEL_214": 214,
-    "LABEL_215": 215,
-    "LABEL_216": 216,
-    "LABEL_217": 217,
-    "LABEL_218": 218,
-    "LABEL_219": 219,
-    "LABEL_22": 22,
-    "LABEL_220": 220,
-    "LABEL_221": 221,
-    "LABEL_222": 222,
-    "LABEL_223": 223,
-    "LABEL_224": 224,
-    "LABEL_23": 23,
-    "LABEL_24": 24,
-    "LABEL_25": 25,
-    "LABEL_26": 26,
-    "LABEL_27": 27,
-    "LABEL_28": 28,
-    "LABEL_29": 29,
-    "LABEL_3": 3,
-    "LABEL_30": 30,
-    "LABEL_31": 31,
-    "LABEL_32": 32,
-    "LABEL_33": 33,
-    "LABEL_34": 34,
-    "LABEL_35": 35,
-    "LABEL_36": 36,
-    "LABEL_37": 37,
-    "LABEL_38": 38,
-    "LABEL_39": 39,
-    "LABEL_4": 4,
-    "LABEL_40": 40,
-    "LABEL_41": 41,
-    "LABEL_42": 42,
-    "LABEL_43": 43,
-    "LABEL_44": 44,
-    "LABEL_45": 45,
-    "LABEL_46": 46,
-    "LABEL_47": 47,
-    "LABEL_48": 48,
-    "LABEL_49": 49,
-    "LABEL_5": 5,
-    "LABEL_50": 50,
-    "LABEL_51": 51,
-    "LABEL_52": 52,
-    "LABEL_53": 53,
-    "LABEL_54": 54,
-    "LABEL_55": 55,
-    "LABEL_56": 56,
-    "LABEL_57": 57,
-    "LABEL_58": 58,
-    "LABEL_59": 59,
-    "LABEL_6": 6,
-    "LABEL_60": 60,
-    "LABEL_61": 61,
-    "LABEL_62": 62,
-    "LABEL_63": 63,
-    "LABEL_64": 64,
-    "LABEL_65": 65,
-    "LABEL_66": 66,
-    "LABEL_67": 67,
-    "LABEL_68": 68,
-    "LABEL_69": 69,
-    "LABEL_7": 7,
-    "LABEL_70": 70,
-    "LABEL_71": 71,
-    "LABEL_72": 72,
-    "LABEL_73": 73,
-    "LABEL_74": 74,
-    "LABEL_75": 75,
-    "LABEL_76": 76,
-    "LABEL_77": 77,
-    "LABEL_78": 78,
-    "LABEL_79": 79,
-    "LABEL_8": 8,
-    "LABEL_80": 80,
-    "LABEL_81": 81,
-    "LABEL_82": 82,
-    "LABEL_83": 83,
-    "LABEL_84": 84,
-    "LABEL_85": 85,
-    "LABEL_86": 86,
-    "LABEL_87": 87,
-    "LABEL_88": 88,
-    "LABEL_89": 89,
-    "LABEL_9": 9,
-    "LABEL_90": 90,
-    "LABEL_91": 91,
-    "LABEL_92": 92,
-    "LABEL_93": 93,
-    "LABEL_94": 94,
-    "LABEL_95": 95,
-    "LABEL_96": 96,
-    "LABEL_97": 97,
-    "LABEL_98": 98,
-    "LABEL_99": 99
+    "LABEL_2": 2
   },
   "layer_norm_eps": 1e-12,
   "max_position_embeddings": 512,
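The trimmed config leaves a 3-way classification head (LABEL_0 through LABEL_2) in place of the old 225-way one. A minimal sketch of how the id2label/label2id maps decode a prediction; the logits here are made-up illustration values, not output from this model:

```python
# Label maps as they appear in the updated config.json.
id2label = {0: "LABEL_0", 1: "LABEL_1", 2: "LABEL_2"}
label2id = {name: idx for idx, name in id2label.items()}

# Hypothetical logits for one example; a real model head would produce these.
logits = [0.1, 2.3, -0.7]
pred_id = max(range(len(logits)), key=logits.__getitem__)
print(id2label[pred_id])  # LABEL_1

# The two maps must stay mutually consistent for round-tripping.
assert label2id[id2label[pred_id]] == pred_id
```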
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:04f73bd904feee9317a4ab0e5685e4a15dc893e9a4008e9ce6f58f4ab863ee5b
-size 409786196
+oid sha256:977260f8172675cb2f205c7bce6c10e82b7c8da77804dbf257cf22416d6ee5d2
+size 409103316
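The model.safetensors entries above are Git LFS pointer files: a few `key value` lines (version, oid, size) standing in for the real binary in the git history. A small parser for that format, assuming only the three fields shown in the diff (the helper name is ours, not part of any library):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into a {key: value} dict.

    Each non-empty line is '<key> <value>', e.g. 'size 409103316'.
    """
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The new pointer content from the diff above.
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:977260f8172675cb2f205c7bce6c10e82b7c8da77804dbf257cf22416d6ee5d2
size 409103316
"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # 409103316
```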
runs/May13_10-42-56_d9a5d6f3716c/events.out.tfevents.1715596995.d9a5d6f3716c.24.0 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:02bd68989439bcd189dca8b17148d3f883bed9d499fc02be0cda9c426846b7b1
+size 429506
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:ea76e6b365924ae1598ba7da5bd31953f92e00d96fca83e07388719f40da6b1c
+oid sha256:9a9e884913346487b64f991638d8232fae6de699662a1a7533a69bd91814343f
 size 4920