---
license: mit
base_model: xlm-roberta-base
tags:
- generated_from_trainer
datasets:
- xtreme
metrics:
- f1
model-index:
- name: xlm-roberta-base-finetuned-panx-hi
  results:
  - task:
      name: Token Classification
      type: token-classification
    dataset:
      name: xtreme
      type: xtreme
      config: PAN-X.hi
      split: validation
      args: PAN-X.hi
    metrics:
    - name: F1
      type: f1
      value: 0.8699106256206554
---

# xlm-roberta-base-finetuned-panx-hi

This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the PAN-X.hi (Hindi) subset of the xtreme dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5351
- F1: 0.8699

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1     |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.843         | 1.0   | 86   | 0.3921          | 0.7206 |
| 0.3751        | 2.0   | 172  | 0.3292          | 0.7381 |
| 0.2437        | 3.0   | 258  | 0.3049          | 0.8255 |
| 0.1963        | 4.0   | 344  | 0.2865          | 0.8255 |
| 0.1519        | 5.0   | 430  | 0.3165          | 0.8422 |
| 0.1332        | 6.0   | 516  | 0.3019          | 0.8433 |
| 0.0748        | 7.0   | 602  | 0.3290          | 0.8474 |
| 0.0703        | 8.0   | 688  | 0.3825          | 0.8517 |
| 0.0611        | 9.0   | 774  | 0.4575          | 0.8518 |
| 0.0484        | 10.0  | 860  | 0.4409          | 0.8481 |
| 0.0345        | 11.0  | 946  | 0.4221          | 0.8597 |
| 0.0242        | 12.0  | 1032 | 0.4673          | 0.8682 |
| 0.0178        | 13.0  | 1118 | 0.4826          | 0.8671 |
| 0.0129        | 14.0  | 1204 | 0.5174          | 0.8443 |
| 0.0139        | 15.0  | 1290 | 0.5028          | 0.8608 |
| 0.0063        | 16.0  | 1376 | 0.5224          | 0.8648 |
| 0.0027        | 17.0  | 1462 | 0.5260          | 0.8682 |
| 0.0036        | 18.0  | 1548 | 0.5301          | 0.8702 |
| 0.0028        | 19.0  | 1634 | 0.5372          | 0.8741 |
| 0.0038        | 20.0  | 1720 | 0.5351          | 0.8699 |

### Framework versions

- Transformers 4.33.1
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3
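
The F1 values reported above are entity-level scores for token classification, of the kind libraries such as seqeval compute: a predicted entity counts as correct only if its span boundaries and type both match the gold annotation exactly. A minimal, self-contained sketch of this metric over BIO tag sequences (all function names here are illustrative, not from this repository; stray `I-` tags without a preceding `B-` are ignored for simplicity):

```python
# Sketch of entity-level (span-level) F1 over BIO tag sequences.
# This mirrors the style of metric in the results table; it is an
# illustrative simplification, not the exact evaluation code.

def extract_spans(tags):
    """Return a set of (start, end, type) entity spans from BIO tags.

    `end` is exclusive. I- tags that do not continue a matching
    entity are ignored in this simplified version.
    """
    spans, start, etype = set(), None, None
    for i, tag in enumerate(tags + ["O"]):  # sentinel flushes the last span
        continues = tag.startswith("I-") and tag[2:] == etype
        if not continues:
            if start is not None:
                spans.add((start, i, etype))
                start, etype = None, None
            if tag.startswith("B-"):
                start, etype = i, tag[2:]
    return spans

def entity_f1(gold, pred):
    """Micro-averaged F1 between two lists of BIO tag sequences."""
    tp = fp = fn = 0
    for g, p in zip(gold, pred):
        gs, ps = extract_spans(g), extract_spans(p)
        tp += len(gs & ps)
        fp += len(ps - gs)
        fn += len(gs - ps)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0
```

For example, if the gold tags are `["B-PER", "I-PER", "O"]` and the model predicts `["B-PER", "O", "O"]`, the predicted `PER` span covers only one token instead of two, so it scores zero despite the partial overlap; this strictness is why entity-level F1 is typically lower than per-token accuracy.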