---
license: mit
base_model: xlm-roberta-base
tags:
- generated_from_trainer
datasets:
- massive
metrics:
- accuracy
- f1
model-index:
- name: scenario-NON-KD-SCR-D2_data-AmazonScience_massive_all_1_1_alpha-jason
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: massive
type: massive
config: all_1.1
split: validation
args: all_1.1
metrics:
- name: Accuracy
type: accuracy
value: 0.8063396269249687
- name: F1
type: f1
value: 0.7734773768161987
---
# scenario-NON-KD-SCR-D2_data-AmazonScience_massive_all_1_1_alpha-jason
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the [MASSIVE](https://huggingface.co/datasets/AmazonScience/massive) dataset (`all_1.1` configuration).
It achieves the following results on the evaluation set:
- Loss: 1.9035
- Accuracy: 0.8063
- F1: 0.7735
## Model description
This is [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) with a sequence-classification head, fine-tuned for intent classification on MASSIVE. The `all_1.1` configuration pools all languages of MASSIVE v1.1 into a single corpus, so the resulting classifier is multilingual.
## Intended uses & limitations
The model is intended for classifying short, voice-assistant-style utterances into MASSIVE intent labels across the languages covered by the dataset. It has only been evaluated on the MASSIVE validation split (accuracy 0.8063, F1 0.7735); performance on other domains, or on languages outside MASSIVE, is untested.
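A minimal inference sketch with Transformers (the repo id below assumes the model is published under the `haryoaw` namespace; the example utterance is hypothetical):
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed repo id; adjust if the model lives under a different namespace.
model_id = "haryoaw/scenario-NON-KD-SCR-D2_data-AmazonScience_massive_all_1_1_alpha-jason"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("wake me up at nine am on friday", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to its intent label.
predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])
```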
## Training and evaluation data
Training and evaluation used the `all_1.1` configuration of [MASSIVE](https://huggingface.co/datasets/AmazonScience/massive) (AmazonScience/massive); the metrics above are reported on the validation split.
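For reference, the same data can be loaded with the Datasets library (a sketch; the hub id and config name are taken from this card's metadata):
```python
from datasets import load_dataset

# "all_1.1" pools every language of MASSIVE v1.1 into a single config.
dataset = load_dataset("AmazonScience/massive", "all_1.1")

print(dataset)                     # train / validation / test splits
print(dataset["train"][0]["utt"])  # "utt" holds the raw utterance text
```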
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 111
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
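The values above map onto a `TrainingArguments` sketch like the following (the evaluation and logging cadence is an assumption inferred from the 5000-step intervals in the results table below; the Adam betas and epsilon match the Trainer defaults, so they are not set explicitly):
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="scenario-NON-KD-SCR-D2_data-AmazonScience_massive_all_1_1_alpha-jason",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=111,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    # Assumed cadence: the results table reports validation metrics
    # every 5000 steps, which this reproduces.
    evaluation_strategy="steps",
    eval_steps=5000,
    logging_steps=5000,
)
```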
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:------:|:---------------:|:--------:|:------:|
| 1.5552 | 0.27 | 5000 | 1.5319 | 0.5845 | 0.4786 |
| 1.0578 | 0.53 | 10000 | 1.0587 | 0.7169 | 0.6341 |
| 0.883 | 0.8 | 15000 | 0.9303 | 0.7512 | 0.6959 |
| 0.6259 | 1.07 | 20000 | 0.8622 | 0.7759 | 0.7247 |
| 0.5994 | 1.34 | 25000 | 0.8258 | 0.7854 | 0.7403 |
| 0.6048 | 1.6 | 30000 | 0.7925 | 0.7930 | 0.7466 |
| 0.5577 | 1.87 | 35000 | 0.7766 | 0.7987 | 0.7500 |
| 0.3568 | 2.14 | 40000 | 0.8502 | 0.8004 | 0.7605 |
| 0.3809 | 2.41 | 45000 | 0.8274 | 0.7973 | 0.7671 |
| 0.3825 | 2.67 | 50000 | 0.8014 | 0.8059 | 0.7723 |
| 0.3808 | 2.94 | 55000 | 0.8177 | 0.8068 | 0.7701 |
| 0.2409 | 3.21 | 60000 | 0.8972 | 0.8040 | 0.7748 |
| 0.2636 | 3.47 | 65000 | 0.8961 | 0.8053 | 0.7667 |
| 0.2662 | 3.74 | 70000 | 0.8865 | 0.8035 | 0.7687 |
| 0.2355 | 4.01 | 75000 | 0.9449 | 0.8076 | 0.7742 |
| 0.1731 | 4.28 | 80000 | 1.0169 | 0.8049 | 0.7672 |
| 0.1972 | 4.54 | 85000 | 0.9849 | 0.8065 | 0.7753 |
| 0.2029 | 4.81 | 90000 | 0.9689 | 0.8089 | 0.7772 |
| 0.1278 | 5.08 | 95000 | 1.0929 | 0.8076 | 0.7785 |
| 0.1453 | 5.34 | 100000 | 1.0971 | 0.8082 | 0.7786 |
| 0.1534 | 5.61 | 105000 | 1.0825 | 0.8046 | 0.7760 |
| 0.1538 | 5.88 | 110000 | 1.0960 | 0.8084 | 0.7769 |
| 0.0979 | 6.15 | 115000 | 1.2774 | 0.8015 | 0.7706 |
| 0.1093 | 6.41 | 120000 | 1.2227 | 0.8060 | 0.7785 |
| 0.1149 | 6.68 | 125000 | 1.2517 | 0.8085 | 0.7784 |
| 0.1239 | 6.95 | 130000 | 1.2183 | 0.8073 | 0.7747 |
| 0.0908 | 7.22 | 135000 | 1.2683 | 0.8062 | 0.7758 |
| 0.1043 | 7.48 | 140000 | 1.2992 | 0.8065 | 0.7781 |
| 0.0971 | 7.75 | 145000 | 1.2978 | 0.8062 | 0.7752 |
| 0.0872 | 8.02 | 150000 | 1.3343 | 0.8046 | 0.7745 |
| 0.0762 | 8.28 | 155000 | 1.4315 | 0.8037 | 0.7793 |
| 0.0856 | 8.55 | 160000 | 1.3695 | 0.8068 | 0.7804 |
| 0.0923 | 8.82 | 165000 | 1.3585 | 0.8077 | 0.7811 |
| 0.0611 | 9.09 | 170000 | 1.4557 | 0.8039 | 0.7754 |
| 0.0671 | 9.35 | 175000 | 1.4726 | 0.8029 | 0.7708 |
| 0.0711 | 9.62 | 180000 | 1.4840 | 0.8042 | 0.7728 |
| 0.0757 | 9.89 | 185000 | 1.4514 | 0.8029 | 0.7702 |
| 0.0543 | 10.15 | 190000 | 1.5208 | 0.8046 | 0.7731 |
| 0.0527 | 10.42 | 195000 | 1.6045 | 0.8019 | 0.7725 |
| 0.064 | 10.69 | 200000 | 1.4989 | 0.8038 | 0.7742 |
| 0.0616 | 10.96 | 205000 | 1.5399 | 0.8037 | 0.7727 |
| 0.0543 | 11.22 | 210000 | 1.4915 | 0.8081 | 0.7783 |
| 0.0506 | 11.49 | 215000 | 1.5569 | 0.8044 | 0.7728 |
| 0.063 | 11.76 | 220000 | 1.5712 | 0.8000 | 0.7725 |
| 0.0372 | 12.03 | 225000 | 1.6183 | 0.8029 | 0.7732 |
| 0.0449 | 12.29 | 230000 | 1.6299 | 0.8006 | 0.7740 |
| 0.0522 | 12.56 | 235000 | 1.6166 | 0.8030 | 0.7714 |
| 0.048 | 12.83 | 240000 | 1.6537 | 0.8014 | 0.7720 |
| 0.0354 | 13.09 | 245000 | 1.6848 | 0.8031 | 0.7732 |
| 0.0394 | 13.36 | 250000 | 1.6748 | 0.8014 | 0.7713 |
| 0.0427 | 13.63 | 255000 | 1.6233 | 0.8026 | 0.7715 |
| 0.0499 | 13.9 | 260000 | 1.6319 | 0.8028 | 0.7749 |
| 0.0331 | 14.16 | 265000 | 1.6896 | 0.8028 | 0.7734 |
| 0.0383 | 14.43 | 270000 | 1.6646 | 0.8023 | 0.7723 |
| 0.0476 | 14.7 | 275000 | 1.6470 | 0.8024 | 0.7730 |
| 0.0484 | 14.96 | 280000 | 1.6553 | 0.8012 | 0.7721 |
| 0.0382 | 15.23 | 285000 | 1.6914 | 0.8003 | 0.7689 |
| 0.0386 | 15.5 | 290000 | 1.7338 | 0.8025 | 0.7720 |
| 0.0388 | 15.77 | 295000 | 1.7424 | 0.8005 | 0.7708 |
| 0.023 | 16.03 | 300000 | 1.7477 | 0.8034 | 0.7745 |
| 0.028 | 16.3 | 305000 | 1.7383 | 0.8026 | 0.7734 |
| 0.0323 | 16.57 | 310000 | 1.7738 | 0.8019 | 0.7702 |
| 0.032 | 16.84 | 315000 | 1.7840 | 0.8021 | 0.7735 |
| 0.0247 | 17.1 | 320000 | 1.7916 | 0.8034 | 0.7707 |
| 0.0278 | 17.37 | 325000 | 1.7800 | 0.8019 | 0.7751 |
| 0.0293 | 17.64 | 330000 | 1.8049 | 0.8016 | 0.7687 |
| 0.0354 | 17.9 | 335000 | 1.7460 | 0.8024 | 0.7671 |
| 0.0204 | 18.17 | 340000 | 1.8295 | 0.8002 | 0.7687 |
| 0.0262 | 18.44 | 345000 | 1.7830 | 0.8026 | 0.7689 |
| 0.0277 | 18.71 | 350000 | 1.8273 | 0.8010 | 0.7688 |
| 0.0285 | 18.97 | 355000 | 1.8188 | 0.8012 | 0.7701 |
| 0.0236 | 19.24 | 360000 | 1.8336 | 0.8008 | 0.7676 |
| 0.0235 | 19.51 | 365000 | 1.8579 | 0.8013 | 0.7688 |
| 0.0215 | 19.77 | 370000 | 1.8419 | 0.8030 | 0.7738 |
| 0.0143 | 20.04 | 375000 | 1.8498 | 0.8023 | 0.7713 |
| 0.0231 | 20.31 | 380000 | 1.8420 | 0.8013 | 0.7699 |
| 0.0177 | 20.58 | 385000 | 1.8397 | 0.8027 | 0.7736 |
| 0.0278 | 20.84 | 390000 | 1.8459 | 0.7993 | 0.7664 |
| 0.0153 | 21.11 | 395000 | 1.8486 | 0.8005 | 0.7706 |
| 0.0152 | 21.38 | 400000 | 1.8825 | 0.8030 | 0.7700 |
| 0.0185 | 21.65 | 405000 | 1.8098 | 0.8044 | 0.7724 |
| 0.0129 | 21.91 | 410000 | 1.8306 | 0.8030 | 0.7662 |
| 0.0136 | 22.18 | 415000 | 1.9011 | 0.8026 | 0.7680 |
| 0.0167 | 22.45 | 420000 | 1.8608 | 0.8024 | 0.7698 |
| 0.0144 | 22.71 | 425000 | 1.8313 | 0.8040 | 0.7716 |
| 0.0152 | 22.98 | 430000 | 1.8538 | 0.8035 | 0.7695 |
| 0.0116 | 23.25 | 435000 | 1.8521 | 0.8043 | 0.7734 |
| 0.0146 | 23.52 | 440000 | 1.8894 | 0.8023 | 0.7685 |
| 0.0144 | 23.78 | 445000 | 1.8697 | 0.8031 | 0.7700 |
| 0.0096 | 24.05 | 450000 | 1.9006 | 0.8018 | 0.7696 |
| 0.0124 | 24.32 | 455000 | 1.8807 | 0.8048 | 0.7722 |
| 0.0143 | 24.58 | 460000 | 1.8737 | 0.8025 | 0.7656 |
| 0.0156 | 24.85 | 465000 | 1.8611 | 0.8042 | 0.7723 |
| 0.008 | 25.12 | 470000 | 1.8998 | 0.8035 | 0.7733 |
| 0.0115 | 25.39 | 475000 | 1.9243 | 0.8026 | 0.7724 |
| 0.0133 | 25.65 | 480000 | 1.9014 | 0.8027 | 0.7693 |
| 0.0101 | 25.92 | 485000 | 1.8664 | 0.8046 | 0.7731 |
| 0.0079 | 26.19 | 490000 | 1.8896 | 0.8039 | 0.7676 |
| 0.0108 | 26.46 | 495000 | 1.8998 | 0.8057 | 0.7727 |
| 0.0084 | 26.72 | 500000 | 1.8500 | 0.8023 | 0.7695 |
| 0.0119 | 26.99 | 505000 | 1.8798 | 0.8051 | 0.7724 |
| 0.0089 | 27.26 | 510000 | 1.8926 | 0.8044 | 0.7721 |
| 0.0085 | 27.52 | 515000 | 1.8820 | 0.8056 | 0.7745 |
| 0.007 | 27.79 | 520000 | 1.8751 | 0.8047 | 0.7721 |
| 0.0061 | 28.06 | 525000 | 1.8955 | 0.8060 | 0.7733 |
| 0.0073 | 28.33 | 530000 | 1.9120 | 0.8049 | 0.7734 |
| 0.0095 | 28.59 | 535000 | 1.8995 | 0.8055 | 0.7724 |
| 0.0095 | 28.86 | 540000 | 1.8815 | 0.8058 | 0.7751 |
| 0.0067 | 29.13 | 545000 | 1.9046 | 0.8062 | 0.7734 |
| 0.0074 | 29.39 | 550000 | 1.8968 | 0.8060 | 0.7730 |
| 0.0064 | 29.66 | 555000 | 1.9066 | 0.8062 | 0.7740 |
| 0.0054 | 29.93 | 560000 | 1.9035 | 0.8063 | 0.7735 |
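The accuracy and F1 columns above would typically be produced by a `compute_metrics` callback along these lines (a sketch; the F1 averaging mode is an assumption, as the card does not record it):
```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, predictions),
        # Macro averaging is assumed here, not confirmed by the card.
        "f1": f1_score(labels, predictions, average="macro"),
    }
```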
### Framework versions
- Transformers 4.33.3
- Pytorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.13.3