---
license: mit
base_model: FacebookAI/xlm-roberta-base
tags:
- generated_from_trainer
model-index:
- name: xlm-roberta-base_afr_corr_5e-06
  results: []
---

# xlm-roberta-base_afr_corr_5e-06

This model is a fine-tuned version of [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0220
- Spearman Corr: 0.7641

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 32
- eval_batch_size: 128
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Spearman Corr |
|:-------------:|:-----:|:----:|:---------------:|:-------------:|
| No log        | 0.85  | 200  | 0.0322          | 0.6463        |
| No log        | 1.69  | 400  | 0.0245          | 0.7044        |
| 0.0562        | 2.54  | 600  | 0.0259          | 0.7112        |
| 0.0562        | 3.38  | 800  | 0.0239          | 0.7265        |
| 0.0271        | 4.23  | 1000 | 0.0240          | 0.7442        |
| 0.0271        | 5.07  | 1200 | 0.0228          | 0.7573        |
| 0.0271        | 5.92  | 1400 | 0.0231          | 0.7572        |
| 0.0229        | 6.77  | 1600 | 0.0227          | 0.7518        |
| 0.0229        | 7.61  | 1800 | 0.0218          | 0.7657        |
| 0.0205        | 8.46  | 2000 | 0.0220          | 0.7629        |
| 0.0205        | 9.3   | 2200 | 0.0213          | 0.7704        |
| 0.0186        | 10.15 | 2400 | 0.0222          | 0.7669        |
| 0.0186        | 10.99 | 2600 | 0.0226          | 0.7696        |
| 0.0186        | 11.84 | 2800 | 0.0218          | 0.7697        |
| 0.0168        | 12.68 | 3000 | 0.0217          | 0.7679        |
| 0.0168        | 13.53 | 3200 | 0.0235          | 0.7619        |
| 0.0153        | 14.38 | 3400 | 0.0221          | 0.7696        |
| 0.0153        | 15.22 | 3600 | 0.0225          | 0.7650        |
| 0.0144        | 16.07 | 3800 | 0.0220          | 0.7641        |

### Framework versions

- Transformers 4.37.2
- Pytorch 2.2.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.2
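
### Training configuration sketch

A minimal sketch of a `TrainingArguments` object consistent with the hyperparameters listed above, assuming the standard `Trainer` API from Transformers 4.37. The model, dataset, and metric setup are omitted because they are not documented in this card; `eval_steps=200` is inferred from the step column of the results table.

```python
# Sketch only: reproduces the documented hyperparameters, not the full pipeline.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlm-roberta-base_afr_corr_5e-06",  # hypothetical output path
    learning_rate=5e-6,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=128,
    gradient_accumulation_steps=2,   # effective train batch size of 64
    num_train_epochs=30,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                       # Native AMP mixed precision
    evaluation_strategy="steps",     # evaluation every 200 steps, per the table
    eval_steps=200,
)
```

Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Transformers defaults, so the optimizer needs no explicit configuration here.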
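
### Inference sketch

A minimal usage sketch. The Spearman-correlation metric suggests a single-output regression head scoring sentence pairs, but this is an assumption not confirmed by the card; `MODEL_PATH` and the example sentences are placeholders.

```python
# Sketch only: assumes a single-output regression head over sentence pairs.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_PATH = "xlm-roberta-base_afr_corr_5e-06"  # hypothetical checkpoint location

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_PATH)
model.eval()

# Score a pair of sentences; adapt the input format to the actual task.
inputs = tokenizer("Sin een.", "Sin twee.", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```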