---
license: mit
base_model: nielsr/lilt-xlm-roberta-base
tags:
- generated_from_trainer
datasets:
- xfun
metrics:
- precision
- recall
- f1
model-index:
- name: checkpoints
  results: []
---

# checkpoints

This model is a fine-tuned version of [nielsr/lilt-xlm-roberta-base](https://huggingface.co/nielsr/lilt-xlm-roberta-base) on the xfun dataset.
It achieves the following results on the evaluation set:
- Precision: 0.3111
- Recall: 0.5225
- F1: 0.3900
- Loss: 0.1579

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 10000

A hedged sketch of how these values map onto `transformers.TrainingArguments` follows the Framework versions section below.

### Training results

| Training Loss | Epoch  | Step  | F1     | Validation Loss | Precision | Recall |
|:-------------:|:------:|:-----:|:------:|:---------------:|:---------:|:------:|
| 0.2077        | 16.13  | 500   | 0      | 0.2127          | 0         | 0      |
| 0.1792        | 32.26  | 1000  | 0.2520 | 0.1668          | 0.2345    | 0.2723 |
| 0.1063        | 48.39  | 1500  | 0.2491 | 0.1439          | 0.5851    | 0.1582 |
| 0.1147        | 64.52  | 2000  | 0.3900 | 0.1579          | 0.3111    | 0.5225 |
| 0.0718        | 80.65  | 2500  | 0.4216 | 0.2598          | 0.3328    | 0.5753 |
| 0.0503        | 96.77  | 3000  | 0.4471 | 0.1888          | 0.3563    | 0.6002 |
| 0.0823        | 112.9  | 3500  | 0.4302 | 0.2690          | 0.3157    | 0.6750 |
| 0.0586        | 129.03 | 4000  | 0.4360 | 0.2429          | 0.3211    | 0.6788 |
| 0.0604        | 145.16 | 4500  | 0.4578 | 0.2745          | 0.3503    | 0.6606 |
| 0.0603        | 161.29 | 5000  | 0.4630 | 0.2694          | 0.3483    | 0.6903 |
| 0.0434        | 177.42 | 5500  | 0.4575 | 0.3200          | 0.3417    | 0.6922 |
| 0.0367        | 193.55 | 6000  | 0.4523 | 0.2991          | 0.3321    | 0.7085 |
| 0.0402        | 209.68 | 6500  | 0.4664 | 0.2628          | 0.3507    | 0.6961 |
| 0.027         | 225.81 | 7000  | 0.4671 | 0.3375          | 0.3495    | 0.7037 |
| 0.0363        | 241.94 | 7500  | 0.4621 | 0.3380          | 0.3445    | 0.7018 |
| 0.0411        | 258.06 | 8000  | 0.4735 | 0.2984          | 0.3641    | 0.6769 |
| 0.0348        | 274.19 | 8500  | 0.4682 | 0.3455          | 0.3530    | 0.6951 |
| 0.0031        | 290.32 | 9000  | 0.4675 | 0.3841          | 0.3510    | 0.6999 |
| 0.0259        | 306.45 | 9500  | 0.4693 | 0.3586          | 0.3532    | 0.6989 |
| 0.0129        | 322.58 | 10000 | 0.4680 | 0.3604          | 0.3513    | 0.7009 |

### Framework versions

- Transformers 4.38.2
- Pytorch 2.1.0+cu121
- Datasets 2.18.0
- Tokenizers 0.15.1
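
### Reproducing the training setup

A minimal sketch, assuming the Transformers 4.38.x `Trainer` API, of how the hyperparameters listed above translate into `transformers.TrainingArguments`. The `eval_steps=500` value is inferred from the evaluation cadence in the results table; everything else is taken directly from the list. Dataset preparation, the model, and the `Trainer` call itself are omitted.

```python
# Hedged sketch: maps the hyperparameters from the card onto
# transformers.TrainingArguments (4.38.x API). Not the original
# training script, which is not published with this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="checkpoints",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=2,
    seed=42,
    adam_beta1=0.9,             # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,          # and epsilon=1e-08
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=10000,            # training_steps: 10000
    evaluation_strategy="steps",
    eval_steps=500,             # assumption: inferred from the results table
    logging_steps=500,
)
```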
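
## Usage example

The card does not state which XFUN task head this checkpoint carries. The sketch below is a minimal example that *assumes* a token-classification head (XFUN-style semantic entity labeling); the local `./checkpoints` path, the sample text, and the dummy bounding boxes are all placeholders, not published artifacts.

```python
# Minimal inference sketch under the assumptions stated above.
import torch
from transformers import AutoTokenizer, LiltForTokenClassification

checkpoint = "./checkpoints"  # placeholder: wherever this model was saved
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = LiltForTokenClassification.from_pretrained(checkpoint)

inputs = tokenizer("Invoice number: 12345", return_tensors="pt")

# LiLT expects one bounding box per token, normalized to a 0-1000 grid.
# Real coordinates would come from an OCR engine; these are dummies.
seq_len = inputs["input_ids"].shape[1]
bbox = torch.tensor([[0, 0, 1000, 1000]] * seq_len).unsqueeze(0)

with torch.no_grad():
    logits = model(**inputs, bbox=bbox).logits

# Map the highest-scoring class per token back to its label name.
predicted = [model.config.id2label[i] for i in logits.argmax(-1)[0].tolist()]
print(predicted)
```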