---
license: mit
base_model: FacebookAI/xlm-roberta-base
tags:
- generated_from_trainer
datasets:
- tweet_sentiment_multilingual
metrics:
- accuracy
- f1
model-index:
- name: scenario-KD-PR-CDF-ALL-D2_data-cardiffnlp_tweet_sentiment_multilingual_all_delta
results: []
---
# scenario-KD-PR-CDF-ALL-D2_data-cardiffnlp_tweet_sentiment_multilingual_all_delta
This model is a fine-tuned version of [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) on the tweet_sentiment_multilingual dataset.
It achieves the following results on the evaluation set:
- Loss: 3.4263
- Accuracy: 0.5617
- F1: 0.5621
## Model description
This checkpoint is a multilingual tweet sentiment classifier obtained by fine-tuning [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) on the tweet_sentiment_multilingual dataset. The run name (`scenario-KD-PR-CDF-ALL-D2_..._delta`) suggests a knowledge-distillation (KD) training scenario, but the card does not document the teacher model, distillation loss, or other setup details.
## Intended uses & limitations
The model is intended for sentiment classification of short texts (tweets) in the languages covered by the dataset. Limitations were not documented; as with any tweet-trained classifier, performance on longer or out-of-domain text should be verified before use.
## Training and evaluation data
Training and evaluation used the tweet_sentiment_multilingual dataset; the `all` suffix in the run name suggests the combined multilingual configuration rather than a single-language split.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 7777
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|
| 4.8568 | 1.09 | 500 | 3.9914 | 0.4734 | 0.4688 |
| 3.9413 | 2.17 | 1000 | 3.8048 | 0.5127 | 0.5070 |
| 3.4502 | 3.26 | 1500 | 3.5184 | 0.5289 | 0.5171 |
| 3.0935 | 4.35 | 2000 | 3.3541 | 0.5436 | 0.5418 |
| 2.7635 | 5.43 | 2500 | 3.3827 | 0.5444 | 0.5443 |
| 2.5494 | 6.52 | 3000 | 3.4817 | 0.5428 | 0.5440 |
| 2.3131 | 7.61 | 3500 | 3.3051 | 0.5640 | 0.5567 |
| 2.131 | 8.7 | 4000 | 3.2511 | 0.5548 | 0.5574 |
| 1.9564 | 9.78 | 4500 | 3.4609 | 0.5583 | 0.5544 |
| 1.8216 | 10.87 | 5000 | 3.2391 | 0.5502 | 0.5520 |
| 1.7048 | 11.96 | 5500 | 3.2188 | 0.5525 | 0.5531 |
| 1.575 | 13.04 | 6000 | 3.2912 | 0.5637 | 0.5611 |
| 1.4693 | 14.13 | 6500 | 3.5853 | 0.5629 | 0.5628 |
| 1.4236 | 15.22 | 7000 | 3.3838 | 0.5421 | 0.5441 |
| 1.3372 | 16.3 | 7500 | 3.5262 | 0.5590 | 0.5583 |
| 1.2829 | 17.39 | 8000 | 3.6001 | 0.5552 | 0.5535 |
| 1.2351 | 18.48 | 8500 | 3.3745 | 0.5525 | 0.5513 |
| 1.1562 | 19.57 | 9000 | 3.3239 | 0.5706 | 0.5726 |
| 1.1264 | 20.65 | 9500 | 3.4648 | 0.5490 | 0.5507 |
| 1.0806 | 21.74 | 10000 | 3.4269 | 0.5652 | 0.5652 |
| 1.066 | 22.83 | 10500 | 3.3415 | 0.5613 | 0.5617 |
| 1.0144 | 23.91 | 11000 | 3.5331 | 0.5610 | 0.5623 |
| 0.9746 | 25.0 | 11500 | 3.5136 | 0.5625 | 0.5625 |
| 0.9415 | 26.09 | 12000 | 3.5623 | 0.5540 | 0.5530 |
| 0.932 | 27.17 | 12500 | 3.5626 | 0.5633 | 0.5622 |
| 0.9016 | 28.26 | 13000 | 3.6071 | 0.5467 | 0.5460 |
| 0.8847 | 29.35 | 13500 | 3.5201 | 0.5513 | 0.5519 |
| 0.8713 | 30.43 | 14000 | 3.5412 | 0.5660 | 0.5655 |
| 0.8459 | 31.52 | 14500 | 3.5206 | 0.5556 | 0.5554 |
| 0.8255 | 32.61 | 15000 | 3.4715 | 0.5552 | 0.5563 |
| 0.8082 | 33.7 | 15500 | 3.4875 | 0.5579 | 0.5584 |
| 0.7899 | 34.78 | 16000 | 3.4935 | 0.5775 | 0.5758 |
| 0.7958 | 35.87 | 16500 | 3.4224 | 0.5544 | 0.5555 |
| 0.7745 | 36.96 | 17000 | 3.3893 | 0.5671 | 0.5686 |
| 0.7666 | 38.04 | 17500 | 3.3972 | 0.5629 | 0.5640 |
| 0.7574 | 39.13 | 18000 | 3.5453 | 0.5706 | 0.5698 |
| 0.7468 | 40.22 | 18500 | 3.4342 | 0.5671 | 0.5660 |
| 0.7449 | 41.3 | 19000 | 3.3906 | 0.5640 | 0.5642 |
| 0.7338 | 42.39 | 19500 | 3.4109 | 0.5721 | 0.5728 |
| 0.7157 | 43.48 | 20000 | 3.3499 | 0.5721 | 0.5717 |
| 0.7285 | 44.57 | 20500 | 3.2780 | 0.5718 | 0.5718 |
| 0.7101 | 45.65 | 21000 | 3.3873 | 0.5648 | 0.5653 |
| 0.7144 | 46.74 | 21500 | 3.4731 | 0.5613 | 0.5621 |
| 0.7158 | 47.83 | 22000 | 3.4394 | 0.5733 | 0.5728 |
| 0.7016 | 48.91 | 22500 | 3.4609 | 0.5544 | 0.5545 |
| 0.7055 | 50.0 | 23000 | 3.4263 | 0.5617 | 0.5621 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.13.3