---
license: mit
base_model: FacebookAI/xlm-roberta-base
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: scenario-KD-PR-MSV-D2_data-cl-cardiff_cl_only_gamma-jason
  results: []
---

# scenario-KD-PR-MSV-D2_data-cl-cardiff_cl_only_gamma-jason

This model is a fine-tuned version of [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 17.2063
- Accuracy: 0.3889
- F1: 0.3875

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged configuration sketch mirroring them appears at the end of this card):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 88458
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log        | 1.09  | 250  | 12.0692         | 0.3727   | 0.3489 |
| 14.0169       | 2.17  | 500  | 10.9338         | 0.3819   | 0.3797 |
| 14.0169       | 3.26  | 750  | 10.7424         | 0.4059   | 0.4032 |
| 11.6579       | 4.35  | 1000 | 11.5422         | 0.3927   | 0.3806 |
| 11.6579       | 5.43  | 1250 | 11.1316         | 0.4012   | 0.4005 |
| 9.9619        | 6.52  | 1500 | 11.8448         | 0.4105   | 0.4082 |
| 9.9619        | 7.61  | 1750 | 12.5910         | 0.4174   | 0.4151 |
| 8.6208        | 8.7   | 2000 | 11.4658         | 0.3989   | 0.3971 |
| 8.6208        | 9.78  | 2250 | 12.6395         | 0.4066   | 0.4067 |
| 7.4098        | 10.87 | 2500 | 12.9836         | 0.4035   | 0.3977 |
| 7.4098        | 11.96 | 2750 | 13.6894         | 0.3966   | 0.3929 |
| 6.5302        | 13.04 | 3000 | 15.4482         | 0.3997   | 0.3936 |
| 6.5302        | 14.13 | 3250 | 14.9618         | 0.3688   | 0.3596 |
| 5.6406        | 15.22 | 3500 | 14.4216         | 0.3943   | 0.3905 |
| 5.6406        | 16.3  | 3750 | 15.1277         | 0.4035   | 0.4023 |
| 5.0685        | 17.39 | 4000 | 15.6837         | 0.3920   | 0.3883 |
| 5.0685        | 18.48 | 4250 | 16.1391         | 0.4020   | 0.4015 |
| 4.3243        | 19.57 | 4500 | 16.0750         | 0.3773   | 0.3744 |
| 4.3243        | 20.65 | 4750 | 16.0997         | 0.3989   | 0.3951 |
| 4.0088        | 21.74 | 5000 | 16.2234         | 0.3966   | 0.3907 |
| 4.0088        | 22.83 | 5250 | 16.6659         | 0.3850   | 0.3833 |
| 3.5813        | 23.91 | 5500 | 16.1725         | 0.4136   | 0.4133 |
| 3.5813        | 25.0  | 5750 | 16.7571         | 0.3873   | 0.3850 |
| 3.2165        | 26.09 | 6000 | 17.0675         | 0.3951   | 0.3943 |
| 3.2165        | 27.17 | 6250 | 16.7777         | 0.3951   | 0.3934 |
| 3.0739        | 28.26 | 6500 | 17.1236         | 0.4043   | 0.4036 |
| 3.0739        | 29.35 | 6750 | 17.2063         | 0.3889   | 0.3875 |

### Framework versions

- Transformers 4.33.3
- PyTorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.13.3
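
### Training configuration sketch

The hyperparameters listed above map roughly onto `transformers.TrainingArguments`. The sketch below is only a minimal reconstruction of those values: the dataset, the classification head, and the knowledge-distillation objective suggested by "KD" in the model name are not documented in this card and are therefore omitted. The output directory name is a placeholder.

```python
# Minimal sketch mirroring the reported hyperparameters (not the original
# training script; the distillation loss and data pipeline are unknown).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="scenario-KD-PR-MSV-D2_data-cl-cardiff_cl_only_gamma-jason",  # placeholder
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=88458,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=30,
)
```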
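
### Example inference (sketch)

The card does not document the task or the label set; the accuracy and F1 metrics suggest a sequence-classification head, so the sketch below assumes one. The repository namespace (`your-namespace`) is a placeholder and should be replaced with the actual Hub path.

```python
# Hedged inference sketch: the repo namespace and the classification task are
# assumptions, not documented facts from this card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "your-namespace/scenario-KD-PR-MSV-D2_data-cl-cardiff_cl_only_gamma-jason"  # placeholder namespace
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("I really enjoyed this!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(dim=-1).item()
print(model.config.id2label.get(predicted_class, predicted_class))
```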