
scenario-KD-PR-MSV-D2_data-AmazonScience_massive_all_1_155

This model is a fine-tuned version of haryoaw/scenario-MDBT-TCR_data-AmazonScience_massive_all_1_1 on the MASSIVE dataset. It achieves the following results on the evaluation set (an illustrative loading sketch follows this list):

  • Loss: 1.4402
  • Accuracy: 0.8575
  • F1: 0.8343
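
Although the Hub page does not declare a pipeline type, the metrics above suggest sequence classification (intent classification on MASSIVE); the sketch below rests on that assumption and on the checkpoint exposing a standard classification head.

```python
# Hedged sketch: the task (intent classification) and the head type are
# assumptions, since the model card does not declare a pipeline type.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "haryoaw/scenario-KD-PR-MSV-D2_data-AmazonScience_massive_all_1_155"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("wake me up at nine am on friday", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred_id = int(logits.argmax(dim=-1))
print(model.config.id2label.get(pred_id, pred_id))
```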

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative TrainingArguments sketch follows this list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 55
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 30
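
A hedged sketch that restates these values as Transformers TrainingArguments; dataset preparation and the distillation objective suggested by the "KD" prefix are not shown and remain assumptions about the original setup.

```python
# Hedged sketch: only the hyperparameters listed above are reproduced here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="scenario-KD-PR-MSV-D2_data-AmazonScience_massive_all_1_155",
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=55,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    # Adam betas and epsilon as listed on the card (these match the
    # Transformers defaults for the AdamW optimizer).
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
)
```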

Training results

Training Loss Epoch Step Validation Loss Accuracy F1
1.4282 0.27 5000 1.5913 0.8176 0.7771
1.261 0.53 10000 1.5198 0.8303 0.7951
1.2127 0.8 15000 1.4862 0.8367 0.8061
1.0927 1.07 20000 1.4876 0.8356 0.8032
1.0692 1.34 25000 1.4776 0.8376 0.8041
1.0537 1.6 30000 1.4714 0.8408 0.8101
1.0357 1.87 35000 1.4689 0.8389 0.8136
0.9737 2.14 40000 1.4509 0.8482 0.8270
0.9719 2.41 45000 1.4763 0.8410 0.8185
0.9692 2.67 50000 1.4650 0.8437 0.8210
0.978 2.94 55000 1.4563 0.8473 0.8266
0.9229 3.21 60000 1.4699 0.8437 0.8193
0.9323 3.47 65000 1.4563 0.8485 0.8247
0.9274 3.74 70000 1.4588 0.8469 0.8205
0.9177 4.01 75000 1.4698 0.8447 0.8227
0.8939 4.28 80000 1.4774 0.8452 0.8227
0.8926 4.54 85000 1.4697 0.8460 0.8185
0.8973 4.81 90000 1.4907 0.8414 0.8182
0.8696 5.08 95000 1.4705 0.8469 0.8217
0.8784 5.34 100000 1.4698 0.8467 0.8211
0.8803 5.61 105000 1.4622 0.8490 0.8240
0.8849 5.88 110000 1.4692 0.8467 0.8198
0.8614 6.15 115000 1.4861 0.8435 0.8138
0.8613 6.41 120000 1.4682 0.8489 0.8278
0.8704 6.68 125000 1.4696 0.8475 0.8284
0.8687 6.95 130000 1.4723 0.8470 0.8252
0.8469 7.22 135000 1.4750 0.8471 0.8252
0.858 7.48 140000 1.4675 0.8485 0.8221
0.8557 7.75 145000 1.4754 0.8462 0.8234
0.8428 8.02 150000 1.4801 0.8469 0.8240
0.8397 8.28 155000 1.4871 0.8462 0.8247
0.8342 8.55 160000 1.4838 0.8471 0.8237
0.8462 8.82 165000 1.4622 0.8474 0.8225
0.8249 9.09 170000 1.4768 0.8483 0.8270
0.8259 9.35 175000 1.4950 0.8443 0.8186
0.8304 9.62 180000 1.4682 0.8510 0.8261
0.8342 9.89 185000 1.4754 0.8480 0.8216
0.8165 10.15 190000 1.4817 0.8473 0.8253
0.8229 10.42 195000 1.4884 0.8463 0.8271
0.8237 10.69 200000 1.4906 0.8448 0.8207
0.8275 10.96 205000 1.4803 0.8470 0.8237
0.8124 11.22 210000 1.4765 0.8491 0.8304
0.8144 11.49 215000 1.5053 0.8446 0.8177
0.8146 11.76 220000 1.4706 0.8495 0.8261
0.8063 12.03 225000 1.4681 0.8526 0.8282
0.8141 12.29 230000 1.4725 0.8493 0.8286
0.8098 12.56 235000 1.4612 0.8511 0.8257
0.8159 12.83 240000 1.4833 0.8477 0.8226
0.805 13.09 245000 1.4828 0.8480 0.8235
0.8082 13.36 250000 1.4919 0.8456 0.8185
0.8146 13.63 255000 1.4872 0.8456 0.8217
0.8077 13.9 260000 1.4754 0.8492 0.8209
0.8027 14.16 265000 1.4714 0.8507 0.8257
0.807 14.43 270000 1.4948 0.8441 0.8234
0.8015 14.7 275000 1.4791 0.8497 0.8256
0.8018 14.96 280000 1.4805 0.8487 0.8290
0.8005 15.23 285000 1.4642 0.8523 0.8312
0.799 15.5 290000 1.4692 0.8522 0.8328
0.7994 15.77 295000 1.4783 0.8494 0.8285
0.7942 16.03 300000 1.4749 0.8507 0.8299
0.7924 16.3 305000 1.4702 0.8527 0.8301
0.7978 16.57 310000 1.4882 0.8482 0.8228
0.7953 16.84 315000 1.4707 0.8514 0.8269
0.7867 17.1 320000 1.4929 0.8482 0.8274
0.7888 17.37 325000 1.4731 0.8509 0.8272
0.7938 17.64 330000 1.4739 0.8511 0.8300
0.7903 17.9 335000 1.4537 0.8539 0.8306
0.7876 18.17 340000 1.4700 0.8516 0.8302
0.7902 18.44 345000 1.4813 0.8490 0.8238
0.7894 18.71 350000 1.4617 0.8542 0.8291
0.7872 18.97 355000 1.4713 0.8508 0.8272
0.7866 19.24 360000 1.4712 0.8510 0.8313
0.7828 19.51 365000 1.4642 0.8526 0.8305
0.7873 19.77 370000 1.4590 0.8533 0.8298
0.7781 20.04 375000 1.4681 0.8532 0.8287
0.7783 20.31 380000 1.4707 0.8524 0.8305
0.7851 20.58 385000 1.4626 0.8538 0.8300
0.7845 20.84 390000 1.4547 0.8543 0.8295
0.783 21.11 395000 1.4627 0.8537 0.8309
0.7783 21.38 400000 1.4627 0.8542 0.8324
0.7842 21.65 405000 1.4707 0.8510 0.8261
0.7816 21.91 410000 1.4629 0.8533 0.8298
0.7779 22.18 415000 1.4567 0.8536 0.8262
0.7816 22.45 420000 1.4574 0.8549 0.8326
0.7762 22.71 425000 1.4661 0.8533 0.8311
0.7808 22.98 430000 1.4623 0.8541 0.8306
0.7761 23.25 435000 1.4596 0.8542 0.8297
0.7745 23.52 440000 1.4608 0.8539 0.8296
0.7801 23.78 445000 1.4517 0.8555 0.8305
0.7705 24.05 450000 1.4598 0.8552 0.8327
0.7761 24.32 455000 1.4540 0.8547 0.8296
0.7771 24.58 460000 1.4568 0.8543 0.8308
0.7805 24.85 465000 1.4582 0.8540 0.8306
0.7707 25.12 470000 1.4573 0.8548 0.8311
0.7752 25.39 475000 1.4529 0.8551 0.8329
0.7771 25.65 480000 1.4570 0.8542 0.8317
0.7757 25.92 485000 1.4532 0.8556 0.8333
0.7717 26.19 490000 1.4548 0.8546 0.8303
0.7748 26.46 495000 1.4474 0.8565 0.8335
0.7741 26.72 500000 1.4517 0.8548 0.8311
0.7738 26.99 505000 1.4505 0.8555 0.8330
0.7722 27.26 510000 1.4503 0.8549 0.8307
0.7709 27.52 515000 1.4515 0.8550 0.8317
0.773 27.79 520000 1.4455 0.8560 0.8338
0.7726 28.06 525000 1.4477 0.8560 0.8333
0.7699 28.33 530000 1.4427 0.8564 0.8327
0.7718 28.59 535000 1.4491 0.8556 0.8312
0.7713 28.86 540000 1.4397 0.8575 0.8352
0.7681 29.13 545000 1.4452 0.8574 0.8350
0.7693 29.39 550000 1.4472 0.8559 0.8329
0.77 29.66 555000 1.4437 0.8566 0.8336
0.7689 29.93 560000 1.4402 0.8575 0.8343
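
The Accuracy and F1 columns above are standard classification metrics; a minimal sketch of how they can be computed from predictions follows (the F1 averaging mode used in the original run is not stated on the card, so the choice below is an assumption).

```python
# Hedged sketch: placeholder label ids; the F1 averaging mode is an assumption.
from sklearn.metrics import accuracy_score, f1_score

y_true = [0, 2, 1, 2, 0]  # gold intent ids (placeholder values)
y_pred = [0, 2, 1, 1, 0]  # predicted intent ids (placeholder values)

print("accuracy:", accuracy_score(y_true, y_pred))
print("f1:", f1_score(y_true, y_pred, average="macro"))
```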

Framework versions

  • Transformers 4.33.3
  • Pytorch 2.1.1+cu121
  • Datasets 2.14.5
  • Tokenizers 0.13.3