# scenario-NON-KD-SCR-D2_data-AmazonScience_massive_all_1_1_a

This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the [AmazonScience/massive](https://huggingface.co/datasets/AmazonScience/massive) dataset. It achieves the following results on the evaluation set:
- Loss: 1.9357
- Accuracy: 0.8039
- F1: 0.7712
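
The original card does not include a usage example, so the following is a minimal inference sketch. Assumptions: the checkpoint is an utterance-level classifier over MASSIVE labels, and `MODEL_ID` is a placeholder for this repository's id or a local path to the fine-tuned weights.

```python
# Minimal inference sketch (assumptions: utterance-level classification over
# MASSIVE labels; MODEL_ID is a hypothetical placeholder for this checkpoint).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "path/to/this/checkpoint"  # placeholder, not the actual repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

inputs = tokenizer("wake me up at nine am on friday", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()
print(model.config.id2label[pred])
```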
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 1234
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
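
The hyperparameters above can be expressed as a `transformers.TrainingArguments` configuration; a hedged sketch follows. The output directory name and the 5000-step evaluation interval are inferred from this card, and the dataset and metric wiring are omitted, so this is not the exact training script.

```python
# Sketch of the listed hyperparameters as transformers.TrainingArguments
# (assumptions: output_dir and the 5000-step eval interval are inferred from
# this card; dataset and metric wiring are omitted).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="scenario-NON-KD-SCR-D2_data-AmazonScience_massive_all_1_1_a",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=1234,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    evaluation_strategy="steps",
    eval_steps=5000,
    logging_steps=5000,
)
# training_args would then be passed to transformers.Trainer together with the
# xlm-roberta-base classification model, the tokenized MASSIVE splits, and a
# compute_metrics function returning accuracy and F1.
```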
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
---|---|---|---|---|---|
1.454 | 0.27 | 5000 | 1.4197 | 0.6209 | 0.4886 |
1.0647 | 0.53 | 10000 | 1.0842 | 0.7120 | 0.6261 |
0.8705 | 0.8 | 15000 | 0.9180 | 0.7564 | 0.6880 |
0.6356 | 1.07 | 20000 | 0.8676 | 0.7720 | 0.7144 |
0.6142 | 1.34 | 25000 | 0.8565 | 0.7817 | 0.7226 |
0.5872 | 1.6 | 30000 | 0.7959 | 0.7939 | 0.7496 |
0.5608 | 1.87 | 35000 | 0.7632 | 0.8014 | 0.7591 |
0.3801 | 2.14 | 40000 | 0.8418 | 0.8005 | 0.7655 |
0.3965 | 2.41 | 45000 | 0.8148 | 0.8014 | 0.7618 |
0.3972 | 2.67 | 50000 | 0.8015 | 0.8068 | 0.7705 |
0.3883 | 2.94 | 55000 | 0.8273 | 0.8067 | 0.7724 |
0.2503 | 3.21 | 60000 | 0.8977 | 0.8061 | 0.7689 |
0.2428 | 3.47 | 65000 | 0.9087 | 0.8052 | 0.7704 |
0.2783 | 3.74 | 70000 | 0.8834 | 0.8064 | 0.7748 |
0.2424 | 4.01 | 75000 | 0.9144 | 0.8117 | 0.7784 |
0.1776 | 4.28 | 80000 | 1.0038 | 0.8060 | 0.7729 |
0.1865 | 4.54 | 85000 | 1.0090 | 0.8067 | 0.7714 |
0.1888 | 4.81 | 90000 | 0.9998 | 0.8038 | 0.7759 |
0.1304 | 5.08 | 95000 | 1.1070 | 0.8052 | 0.7786 |
0.1323 | 5.34 | 100000 | 1.1318 | 0.8069 | 0.7761 |
0.1462 | 5.61 | 105000 | 1.1149 | 0.8053 | 0.7779 |
0.1561 | 5.88 | 110000 | 1.1245 | 0.8064 | 0.7801 |
0.103 | 6.15 | 115000 | 1.1859 | 0.8096 | 0.7761 |
0.1136 | 6.41 | 120000 | 1.2550 | 0.8030 | 0.7703 |
0.1239 | 6.68 | 125000 | 1.2233 | 0.8075 | 0.7801 |
0.1271 | 6.95 | 130000 | 1.1780 | 0.8073 | 0.7802 |
0.0871 | 7.22 | 135000 | 1.3128 | 0.8080 | 0.7820 |
0.0951 | 7.48 | 140000 | 1.2719 | 0.8047 | 0.7775 |
0.0988 | 7.75 | 145000 | 1.3432 | 0.7982 | 0.7751 |
0.0771 | 8.02 | 150000 | 1.3020 | 0.8071 | 0.7786 |
0.0672 | 8.28 | 155000 | 1.4223 | 0.8041 | 0.7734 |
0.0847 | 8.55 | 160000 | 1.3962 | 0.8078 | 0.7800 |
0.092 | 8.82 | 165000 | 1.3453 | 0.8066 | 0.7798 |
0.0648 | 9.09 | 170000 | 1.4176 | 0.8058 | 0.7741 |
0.0619 | 9.35 | 175000 | 1.4822 | 0.8037 | 0.7717 |
0.0688 | 9.62 | 180000 | 1.4999 | 0.8028 | 0.7743 |
0.0791 | 9.89 | 185000 | 1.4341 | 0.8016 | 0.7721 |
0.0504 | 10.15 | 190000 | 1.5672 | 0.7990 | 0.7748 |
0.0552 | 10.42 | 195000 | 1.5455 | 0.7998 | 0.7657 |
0.0583 | 10.69 | 200000 | 1.5694 | 0.8031 | 0.7757 |
0.0668 | 10.96 | 205000 | 1.5405 | 0.8021 | 0.7691 |
0.0477 | 11.22 | 210000 | 1.6250 | 0.8026 | 0.7759 |
0.0492 | 11.49 | 215000 | 1.5618 | 0.8016 | 0.7732 |
0.0544 | 11.76 | 220000 | 1.5334 | 0.8059 | 0.7777 |
0.0422 | 12.03 | 225000 | 1.5712 | 0.8029 | 0.7740 |
0.0456 | 12.29 | 230000 | 1.6212 | 0.8013 | 0.7676 |
0.0457 | 12.56 | 235000 | 1.6151 | 0.8041 | 0.7727 |
0.056 | 12.83 | 240000 | 1.6279 | 0.8015 | 0.7680 |
0.0294 | 13.09 | 245000 | 1.6893 | 0.8005 | 0.7680 |
0.0399 | 13.36 | 250000 | 1.6776 | 0.8013 | 0.7746 |
0.0432 | 13.63 | 255000 | 1.6312 | 0.8030 | 0.7751 |
0.0431 | 13.9 | 260000 | 1.6691 | 0.7985 | 0.7691 |
0.0346 | 14.16 | 265000 | 1.6845 | 0.8017 | 0.7731 |
0.0384 | 14.43 | 270000 | 1.6804 | 0.8047 | 0.7768 |
0.0426 | 14.7 | 275000 | 1.7049 | 0.8025 | 0.7762 |
0.045 | 14.96 | 280000 | 1.6726 | 0.7994 | 0.7679 |
0.032 | 15.23 | 285000 | 1.6661 | 0.8021 | 0.7755 |
0.0358 | 15.5 | 290000 | 1.7243 | 0.8006 | 0.7718 |
0.0389 | 15.77 | 295000 | 1.7042 | 0.8046 | 0.7745 |
0.023 | 16.03 | 300000 | 1.7534 | 0.8031 | 0.7754 |
0.0327 | 16.3 | 305000 | 1.7461 | 0.8004 | 0.7704 |
0.0335 | 16.57 | 310000 | 1.6954 | 0.7989 | 0.7662 |
0.0329 | 16.84 | 315000 | 1.7706 | 0.7988 | 0.7702 |
0.0225 | 17.1 | 320000 | 1.7914 | 0.8023 | 0.7769 |
0.0251 | 17.37 | 325000 | 1.8157 | 0.8004 | 0.7709 |
0.0294 | 17.64 | 330000 | 1.7378 | 0.8035 | 0.7753 |
0.028 | 17.9 | 335000 | 1.7316 | 0.8025 | 0.7710 |
0.0214 | 18.17 | 340000 | 1.8072 | 0.7999 | 0.7719 |
0.026 | 18.44 | 345000 | 1.8268 | 0.7992 | 0.7652 |
0.0258 | 18.71 | 350000 | 1.8022 | 0.8013 | 0.7673 |
0.0279 | 18.97 | 355000 | 1.7685 | 0.8030 | 0.7714 |
0.0227 | 19.24 | 360000 | 1.7676 | 0.8025 | 0.7727 |
0.0205 | 19.51 | 365000 | 1.8102 | 0.8021 | 0.7700 |
0.0199 | 19.77 | 370000 | 1.8436 | 0.8013 | 0.7695 |
0.0166 | 20.04 | 375000 | 1.8083 | 0.8037 | 0.7695 |
0.0223 | 20.31 | 380000 | 1.8301 | 0.8022 | 0.7689 |
0.0147 | 20.58 | 385000 | 1.7891 | 0.8032 | 0.7740 |
0.0206 | 20.84 | 390000 | 1.8506 | 0.8002 | 0.7708 |
0.0139 | 21.11 | 395000 | 1.8328 | 0.8043 | 0.7746 |
0.0171 | 21.38 | 400000 | 1.8415 | 0.8041 | 0.7706 |
0.0176 | 21.65 | 405000 | 1.8163 | 0.8016 | 0.7669 |
0.0173 | 21.91 | 410000 | 1.8412 | 0.8016 | 0.7699 |
0.0146 | 22.18 | 415000 | 1.8712 | 0.8023 | 0.7711 |
0.0167 | 22.45 | 420000 | 1.8846 | 0.7982 | 0.7651 |
0.0135 | 22.71 | 425000 | 1.8974 | 0.8026 | 0.7676 |
0.0169 | 22.98 | 430000 | 1.8428 | 0.8021 | 0.7687 |
0.0131 | 23.25 | 435000 | 1.9039 | 0.8010 | 0.7683 |
0.0143 | 23.52 | 440000 | 1.8806 | 0.8002 | 0.7661 |
0.0121 | 23.78 | 445000 | 1.8893 | 0.8039 | 0.7725 |
0.008 | 24.05 | 450000 | 1.9267 | 0.8018 | 0.7695 |
0.016 | 24.32 | 455000 | 1.8843 | 0.8028 | 0.7708 |
0.0133 | 24.58 | 460000 | 1.8713 | 0.8030 | 0.7705 |
0.0122 | 24.85 | 465000 | 1.8984 | 0.8010 | 0.7663 |
0.0129 | 25.12 | 470000 | 1.9349 | 0.8018 | 0.7678 |
0.0126 | 25.39 | 475000 | 1.9035 | 0.8019 | 0.7694 |
0.0145 | 25.65 | 480000 | 1.8795 | 0.8048 | 0.7723 |
0.0107 | 25.92 | 485000 | 1.8795 | 0.8034 | 0.7708 |
0.0084 | 26.19 | 490000 | 1.9312 | 0.8017 | 0.7696 |
0.0103 | 26.46 | 495000 | 1.9346 | 0.8027 | 0.7694 |
0.0081 | 26.72 | 500000 | 1.9338 | 0.8033 | 0.7721 |
0.0124 | 26.99 | 505000 | 1.8983 | 0.8025 | 0.7704 |
0.0115 | 27.26 | 510000 | 1.8886 | 0.8029 | 0.7695 |
0.0076 | 27.52 | 515000 | 1.9516 | 0.8021 | 0.7697 |
0.0091 | 27.79 | 520000 | 1.9145 | 0.8020 | 0.7713 |
0.0067 | 28.06 | 525000 | 1.9112 | 0.8034 | 0.7718 |
0.008 | 28.33 | 530000 | 1.9403 | 0.8034 | 0.7702 |
0.0091 | 28.59 | 535000 | 1.9270 | 0.8033 | 0.7713 |
0.0085 | 28.86 | 540000 | 1.9263 | 0.8047 | 0.7716 |
0.0081 | 29.13 | 545000 | 1.9374 | 0.8051 | 0.7731 |
0.0075 | 29.39 | 550000 | 1.9400 | 0.8041 | 0.7713 |
0.0077 | 29.66 | 555000 | 1.9347 | 0.8042 | 0.7715 |
0.008 | 29.93 | 560000 | 1.9357 | 0.8039 | 0.7712 |
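
Per the table, validation loss rises steadily after roughly epoch 2 while accuracy and F1 plateau near 0.80 and 0.77. For reference, below is a hedged sketch of how the accuracy and F1 columns could be recomputed; the card does not state the F1 averaging mode, so the macro averaging used here is an assumption.

```python
# Sketch of a compute_metrics function producing accuracy and F1
# (assumption: the F1 averaging mode is not stated in this card; "macro" is
# used only for illustration and may differ from the reported values).
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, predictions),
        "f1": f1_score(labels, predictions, average="macro"),
    }
```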
### Framework versions
- Transformers 4.33.3
- Pytorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.13.3