---
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: xlm-roberta-base-david
  results: []
---

# xlm-roberta-base-david

This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0697
- Precision: 0.9497
- Recall: 0.9544
- F1: 0.9520
- Accuracy: 0.9864

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding `TrainingArguments` follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
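The `generated_from_trainer` tag indicates the model was produced with the Transformers `Trainer`. The sketch below shows one way the hyperparameters above could map onto `TrainingArguments`; the `output_dir` is a placeholder, the 100-step evaluation interval is inferred from the results table below, and the Adam betas/epsilon listed above match the library defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

# Minimal sketch, not the original training script.
training_args = TrainingArguments(
    output_dir="xlm-roberta-base-david",  # placeholder output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=5,
    evaluation_strategy="steps",  # assumed: the table reports metrics every 100 steps
    eval_steps=100,
    logging_steps=100,
)
```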
### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 1.2104 | 0.1 | 100 | 0.6752 | 0.1279 | 0.0587 | 0.0804 | 0.7761 |
| 0.5384 | 0.2 | 200 | 0.3366 | 0.2616 | 0.2119 | 0.2341 | 0.8771 |
| 0.3168 | 0.3 | 300 | 0.2264 | 0.5493 | 0.4996 | 0.5233 | 0.9211 |
| 0.2345 | 0.39 | 400 | 0.1796 | 0.6662 | 0.8297 | 0.7390 | 0.9395 |
| 0.1883 | 0.49 | 500 | 0.1687 | 0.7203 | 0.8207 | 0.7672 | 0.9413 |
| 0.1587 | 0.59 | 600 | 0.1414 | 0.7661 | 0.8354 | 0.7992 | 0.9525 |
| 0.1605 | 0.69 | 700 | 0.1209 | 0.7946 | 0.8672 | 0.8293 | 0.9609 |
| 0.1365 | 0.79 | 800 | 0.1120 | 0.8304 | 0.8696 | 0.8495 | 0.9657 |
| 0.1205 | 0.89 | 900 | 0.1098 | 0.8659 | 0.8786 | 0.8722 | 0.9683 |
| 0.1353 | 0.99 | 1000 | 0.1239 | 0.8436 | 0.8794 | 0.8611 | 0.9643 |
| 0.1083 | 1.09 | 1100 | 0.1243 | 0.8537 | 0.8892 | 0.8711 | 0.9657 |
| 0.0961 | 1.18 | 1200 | 0.1078 | 0.8689 | 0.8965 | 0.8825 | 0.9696 |
| 0.0798 | 1.28 | 1300 | 0.0995 | 0.8774 | 0.9038 | 0.8904 | 0.9724 |
| 0.0843 | 1.38 | 1400 | 0.0965 | 0.8793 | 0.9144 | 0.8965 | 0.9733 |
| 0.0923 | 1.48 | 1500 | 0.0957 | 0.8815 | 0.9030 | 0.8921 | 0.9730 |
| 0.0847 | 1.58 | 1600 | 0.0959 | 0.8617 | 0.8941 | 0.8776 | 0.9709 |
| 0.089 | 1.68 | 1700 | 0.0844 | 0.8982 | 0.9201 | 0.9090 | 0.9760 |
| 0.0721 | 1.78 | 1800 | 0.0767 | 0.9 | 0.9095 | 0.9047 | 0.9782 |
| 0.0803 | 1.88 | 1900 | 0.0776 | 0.8981 | 0.9340 | 0.9157 | 0.9774 |
| 0.0766 | 1.97 | 2000 | 0.0611 | 0.9166 | 0.9315 | 0.9240 | 0.9816 |
| 0.0651 | 2.07 | 2100 | 0.0771 | 0.9127 | 0.9454 | 0.9287 | 0.9817 |
| 0.0562 | 2.17 | 2200 | 0.0908 | 0.9031 | 0.9112 | 0.9071 | 0.9771 |
| 0.0629 | 2.27 | 2300 | 0.0656 | 0.9184 | 0.9356 | 0.9269 | 0.9817 |
| 0.0504 | 2.37 | 2400 | 0.0836 | 0.8998 | 0.9299 | 0.9146 | 0.9775 |
| 0.0464 | 2.47 | 2500 | 0.0791 | 0.9310 | 0.9340 | 0.9325 | 0.9816 |
| 0.0396 | 2.57 | 2600 | 0.0763 | 0.9167 | 0.9234 | 0.9200 | 0.9816 |
| 0.0582 | 2.67 | 2700 | 0.0705 | 0.9198 | 0.9446 | 0.9320 | 0.9833 |
| 0.0561 | 2.76 | 2800 | 0.0635 | 0.9274 | 0.9470 | 0.9371 | 0.9835 |
| 0.0446 | 2.86 | 2900 | 0.0679 | 0.9301 | 0.9438 | 0.9369 | 0.9828 |
| 0.0429 | 2.96 | 3000 | 0.0663 | 0.9209 | 0.9397 | 0.9302 | 0.9820 |
| 0.0323 | 3.06 | 3100 | 0.0771 | 0.9303 | 0.9462 | 0.9382 | 0.9825 |
| 0.0228 | 3.16 | 3200 | 0.0839 | 0.9279 | 0.9446 | 0.9362 | 0.9830 |
| 0.0332 | 3.26 | 3300 | 0.0717 | 0.9365 | 0.9495 | 0.9429 | 0.9839 |
| 0.0351 | 3.36 | 3400 | 0.0668 | 0.9358 | 0.9381 | 0.9369 | 0.9840 |
| 0.0425 | 3.46 | 3500 | 0.0688 | 0.9363 | 0.9462 | 0.9412 | 0.9838 |
| 0.0431 | 3.55 | 3600 | 0.0710 | 0.9321 | 0.9503 | 0.9411 | 0.9840 |
| 0.0228 | 3.65 | 3700 | 0.0748 | 0.9343 | 0.9511 | 0.9426 | 0.9838 |
| 0.0334 | 3.75 | 3800 | 0.0770 | 0.9401 | 0.9462 | 0.9431 | 0.9834 |
| 0.0373 | 3.85 | 3900 | 0.0713 | 0.9294 | 0.9446 | 0.9369 | 0.9832 |
| 0.0368 | 3.95 | 4000 | 0.0668 | 0.9380 | 0.9495 | 0.9437 | 0.9845 |
| 0.0295 | 4.05 | 4100 | 0.0706 | 0.9364 | 0.9487 | 0.9425 | 0.9843 |
| 0.0169 | 4.15 | 4200 | 0.0675 | 0.9426 | 0.9503 | 0.9464 | 0.9863 |
| 0.0234 | 4.24 | 4300 | 0.0697 | 0.9497 | 0.9544 | 0.9520 | 0.9864 |
| 0.0235 | 4.34 | 4400 | 0.0713 | 0.9392 | 0.9576 | 0.9483 | 0.9857 |
| 0.0233 | 4.44 | 4500 | 0.0689 | 0.9428 | 0.9544 | 0.9486 | 0.9857 |
| 0.015 | 4.54 | 4600 | 0.0744 | 0.9404 | 0.9511 | 0.9457 | 0.9846 |
| 0.0154 | 4.64 | 4700 | 0.0753 | 0.9406 | 0.9552 | 0.9478 | 0.9860 |
| 0.0235 | 4.74 | 4800 | 0.0733 | 0.9431 | 0.9584 | 0.9507 | 0.9859 |
| 0.0239 | 4.84 | 4900 | 0.0728 | 0.9438 | 0.9576 | 0.9506 | 0.9864 |
| 0.0237 | 4.94 | 5000 | 0.0727 | 0.9437 | 0.9560 | 0.9498 | 0.9862 |


### Framework versions

- Transformers 4.29.0.dev0
- Pytorch 1.10.1+cu113
- Datasets 2.11.0
- Tokenizers 0.13.3
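The card does not state the downstream task, but the reported precision/recall/F1/accuracy are typical of token classification. Under that assumption, the checkpoint could be loaded roughly as follows; the repository id and the example sentence are placeholders.

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

# Placeholder id: replace with the actual local path or Hub repository.
model_id = "xlm-roberta-base-david"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Token classification is an assumption based on the reported metrics.
model = AutoModelForTokenClassification.from_pretrained(model_id)

tagger = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",  # merge word-piece predictions into spans
)
print(tagger("Barack Obama visited Paris in 2015."))  # example input only
```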