roberta-mc-4
This model was trained from scratch on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.4442
- Accuracy: 0.5
Model description
More information needed
Intended uses & limitations
More information needed
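No usage details are provided. Judging only from the "mc" in the model name, the checkpoint appears to carry a multiple-choice head; the sketch below shows how such a model might be loaded with the transformers Auto classes. The repository id, prompt, and answer options are placeholders, not part of the original card.

```python
import torch
from transformers import AutoModelForMultipleChoice, AutoTokenizer

# Assumption: the checkpoint exposes a multiple-choice head ("mc" in the name).
model_name = "roberta-mc-4"  # placeholder repository id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMultipleChoice.from_pretrained(model_name)

prompt = "The capital of France is"               # hypothetical question
choices = ["Paris", "Berlin", "Madrid", "Rome"]   # hypothetical answer options

# Encode the prompt once per choice, then add a batch dimension of size 1,
# giving tensors of shape (1, num_choices, seq_len) as the model expects.
encoding = tokenizer([prompt] * len(choices), choices,
                     return_tensors="pt", padding=True)
inputs = {k: v.unsqueeze(0) for k, v in encoding.items()}

with torch.no_grad():
    logits = model(**inputs).logits               # shape: (1, num_choices)
print(choices[logits.argmax(dim=-1).item()])
```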
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
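A minimal sketch of how these settings could be expressed with the transformers Trainer API. The output directory is a placeholder and per-epoch evaluation is an assumption based on the results table; the original training script is not provided. The Adam betas/epsilon and the linear schedule listed above match the Trainer defaults.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above (output_dir is a placeholder).
training_args = TrainingArguments(
    output_dir="roberta-mc-4",
    learning_rate=1e-3,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    evaluation_strategy="epoch",  # assumption: one evaluation per epoch, as in the results table
)
```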
Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| 1.6061 | 1.0 | 24 | 1.5920 | 0.7 |
| 1.6032 | 2.0 | 48 | 1.5838 | 0.6 |
| 1.6104 | 3.0 | 72 | 1.5750 | 0.7 |
| 1.5851 | 4.0 | 96 | 1.5584 | 0.6 |
| 1.5653 | 5.0 | 120 | 1.5059 | 0.7 |
| 1.5485 | 6.0 | 144 | 1.4743 | 0.6 |
| 1.5175 | 7.0 | 168 | 1.4500 | 0.7 |
| 1.5025 | 8.0 | 192 | 1.4298 | 0.5 |
| 1.466 | 9.0 | 216 | 1.4559 | 0.5 |
| 1.4444 | 10.0 | 240 | 1.4010 | 0.5 |
| 1.4223 | 11.0 | 264 | 1.4699 | 0.4 |
| 1.3804 | 12.0 | 288 | 1.4915 | 0.4 |
| 1.3884 | 13.0 | 312 | 1.4624 | 0.4 |
| 1.3699 | 14.0 | 336 | 1.4798 | 0.5 |
| 1.3705 | 15.0 | 360 | 1.3615 | 0.5 |
| 1.3383 | 16.0 | 384 | 1.3814 | 0.7 |
| 1.3306 | 17.0 | 408 | 1.5099 | 0.4 |
| 1.2886 | 18.0 | 432 | 1.5039 | 0.4 |
| 1.2964 | 19.0 | 456 | 1.4033 | 0.5 |
| 1.285 | 20.0 | 480 | 1.4596 | 0.4 |
| 1.311 | 21.0 | 504 | 1.4100 | 0.4 |
| 1.218 | 22.0 | 528 | 1.3952 | 0.5 |
| 1.2193 | 23.0 | 552 | 1.2449 | 0.7 |
| 1.2618 | 24.0 | 576 | 1.2691 | 0.7 |
| 1.236 | 25.0 | 600 | 1.3427 | 0.7 |
| 1.1773 | 26.0 | 624 | 1.3669 | 0.5 |
| 1.1873 | 27.0 | 648 | 1.5114 | 0.5 |
| 1.1519 | 28.0 | 672 | 1.4285 | 0.6 |
| 1.1172 | 29.0 | 696 | 1.4485 | 0.5 |
| 1.0677 | 30.0 | 720 | 1.4442 | 0.5 |
Framework versions
- Transformers 4.31.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3
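To check whether a local environment matches the versions above, a quick sanity check (assuming the four packages are installed) might look like:

```python
# Compare installed package versions against the ones listed in this card.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.31.0",
    "torch": "2.0.1+cu117",
    "datasets": "2.14.4",
    "tokenizers": "0.13.3",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    status = "OK" if installed[name] == want else f"MISMATCH (found {installed[name]})"
    print(f"{name}: expected {want} -> {status}")
```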