# uniBERT.RoBERTa.1

This model is a fine-tuned version of FacebookAI/roberta-base on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.5672
- Accuracy: 0.5442
- F1: 0.5475
- Precision: 0.5667
- Recall: 0.5442
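The accuracy, F1, and precision values were originally logged as one-element tuples, which suggests the return value of scikit-learn's `precision_recall_fscore_support` was not fully unpacked. A minimal sketch of a `compute_metrics` function that yields plain scalars (an assumption about how these metrics were produced, not the author's confirmed code):

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(y_true, y_pred):
    # precision_recall_fscore_support returns a 4-tuple; failing to unpack
    # it is what produces tuple-wrapped values like "(0.5442...,)" in logs.
    precision, recall, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```

With `average="weighted"`, recall equals accuracy on single-label classification, which matches the identical Accuracy and Recall columns in the results below.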
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
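With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate presumably decays linearly from 2e-05 to zero over 163 steps/epoch × 10 epochs = 1630 optimizer steps, matching the Step column in the results table. A minimal sketch of that schedule (an illustration of the assumed decay, not the Transformers implementation itself):

```python
def linear_lr(step, total_steps=163 * 10, base_lr=2e-5):
    """Linear decay from base_lr at step 0 to zero at total_steps,
    assuming no warmup phase."""
    return base_lr * max(0.0, 1.0 - step / total_steps)
```

Under this assumption, the learning rate at the halfway point (step 815, end of epoch 5) would be 1e-05.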
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|---|---|---|---|---|---|---|---|
| 3.2351 | 1.0 | 163 | 2.9595 | 0.1314 | 0.0846 | 0.1574 | 0.1314 |
| 2.6973 | 2.0 | 326 | 2.3531 | 0.2770 | 0.2485 | 0.3688 | 0.2770 |
| 2.1490 | 3.0 | 489 | 1.9958 | 0.3798 | 0.3758 | 0.4622 | 0.3798 |
| 1.7000 | 4.0 | 652 | 1.8677 | 0.4030 | 0.4023 | 0.4956 | 0.4030 |
| 1.4386 | 5.0 | 815 | 1.7207 | 0.4710 | 0.4739 | 0.5205 | 0.4710 |
| 1.1631 | 6.0 | 978 | 1.6443 | 0.4996 | 0.5022 | 0.5328 | 0.4996 |
| 1.0801 | 7.0 | 1141 | 1.6049 | 0.5192 | 0.5209 | 0.5502 | 0.5192 |
| 0.9166 | 8.0 | 1304 | 1.5975 | 0.5299 | 0.5347 | 0.5632 | 0.5299 |
| 0.7877 | 9.0 | 1467 | 1.5711 | 0.5398 | 0.5439 | 0.5629 | 0.5398 |
| 0.8112 | 10.0 | 1630 | 1.5672 | 0.5442 | 0.5475 | 0.5667 | 0.5442 |
### Framework versions
- Transformers 4.39.3
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2