
XMLRobertaLexical-finetuned_70KURL

This model is a fine-tuned version of FacebookAI/xlm-roberta-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2650
  • Accuracy: 0.9691
  • F1: 0.9692
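The accuracy and F1 above are standard classification metrics (the closeness of the two values suggests a fairly balanced label distribution, though the dataset is not documented). As a reference for how such numbers are computed, here is a minimal stdlib-only sketch of accuracy and support-weighted F1, following the common `f1_score(average="weighted")` convention; the example labels are illustrative, not taken from the card:

```python
from collections import Counter

def accuracy(y_true, y_pred):
    # Fraction of predictions that match the labels.
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def weighted_f1(y_true, y_pred):
    # Per-class F1, averaged with each class weighted by its support.
    support = Counter(y_true)
    total = 0.0
    for cls, n in support.items():
        tp = sum(t == cls and p == cls for t, p in zip(y_true, y_pred))
        fp = sum(p == cls and t != cls for t, p in zip(y_true, y_pred))
        fn = sum(t == cls and p != cls for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        total += n * f1
    return total / len(y_true)

# Illustrative labels only (hypothetical, not from the evaluation set).
print(accuracy([1, 1, 0, 0], [1, 0, 0, 0]))     # 0.75
print(weighted_f1([1, 1, 0, 0], [1, 0, 0, 0]))
```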

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
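The hyperparameters above map directly onto a Hugging Face `TrainingArguments` object. A sketch of a matching configuration follows; the output directory name is a hypothetical placeholder, and the checkpointing/evaluation cadence is not stated in the card, so only the listed values are reproduced:

```python
from transformers import TrainingArguments

# Sketch of a TrainingArguments configuration matching the listed
# hyperparameters; "xlmr-lexical-70kurl" is a hypothetical output dir.
args = TrainingArguments(
    output_dir="xlmr-lexical-70kurl",
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    # Adam betas/epsilon are the Transformers defaults and match
    # the optimizer settings reported above.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```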

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--:|
| No log | 0.2326 | 200 | 0.1878 | 0.9387 | 0.9389 |
| No log | 0.4651 | 400 | 0.1532 | 0.9467 | 0.9474 |
| No log | 0.6977 | 600 | 0.1399 | 0.9565 | 0.9569 |
| No log | 0.9302 | 800 | 0.1293 | 0.9615 | 0.9616 |
| 0.204 | 1.1628 | 1000 | 0.1645 | 0.9615 | 0.9616 |
| 0.204 | 1.3953 | 1200 | 0.1162 | 0.9634 | 0.9636 |
| 0.204 | 1.6279 | 1400 | 0.1178 | 0.9632 | 0.9635 |
| 0.204 | 1.8605 | 1600 | 0.1076 | 0.9672 | 0.9673 |
| 0.1192 | 2.0930 | 1800 | 0.1150 | 0.9660 | 0.9661 |
| 0.1192 | 2.3256 | 2000 | 0.1125 | 0.9676 | 0.9678 |
| 0.1192 | 2.5581 | 2200 | 0.1148 | 0.9654 | 0.9655 |
| 0.1192 | 2.7907 | 2400 | 0.1114 | 0.9682 | 0.9683 |
| 0.098 | 3.0233 | 2600 | 0.1177 | 0.9662 | 0.9661 |
| 0.098 | 3.2558 | 2800 | 0.1199 | 0.9650 | 0.9652 |
| 0.098 | 3.4884 | 3000 | 0.1050 | 0.9702 | 0.9703 |
| 0.098 | 3.7209 | 3200 | 0.1018 | 0.9677 | 0.9678 |
| 0.098 | 3.9535 | 3400 | 0.1201 | 0.9674 | 0.9675 |
| 0.0837 | 4.1860 | 3600 | 0.1054 | 0.9687 | 0.9687 |
| 0.0837 | 4.4186 | 3800 | 0.1205 | 0.9679 | 0.9682 |
| 0.0837 | 4.6512 | 4000 | 0.1295 | 0.9686 | 0.9686 |
| 0.0837 | 4.8837 | 4200 | 0.1346 | 0.9659 | 0.9659 |
| 0.0713 | 5.1163 | 4400 | 0.1298 | 0.9662 | 0.9663 |
| 0.0713 | 5.3488 | 4600 | 0.1220 | 0.9671 | 0.9671 |
| 0.0713 | 5.5814 | 4800 | 0.1352 | 0.9657 | 0.9656 |
| 0.0713 | 5.8140 | 5000 | 0.1745 | 0.9622 | 0.9626 |
| 0.0617 | 6.0465 | 5200 | 0.1428 | 0.9662 | 0.9662 |
| 0.0617 | 6.2791 | 5400 | 0.1346 | 0.9678 | 0.9678 |
| 0.0617 | 6.5116 | 5600 | 0.1428 | 0.9689 | 0.9691 |
| 0.0617 | 6.7442 | 5800 | 0.1550 | 0.9647 | 0.9645 |
| 0.0617 | 6.9767 | 6000 | 0.1326 | 0.9687 | 0.9688 |
| 0.0538 | 7.2093 | 6200 | 0.1412 | 0.9650 | 0.9650 |
| 0.0538 | 7.4419 | 6400 | 0.1502 | 0.9684 | 0.9686 |
| 0.0538 | 7.6744 | 6600 | 0.1285 | 0.9671 | 0.9672 |
| 0.0538 | 7.9070 | 6800 | 0.1344 | 0.9691 | 0.9691 |
| 0.0456 | 8.1395 | 7000 | 0.1472 | 0.9681 | 0.9681 |
| 0.0456 | 8.3721 | 7200 | 0.1548 | 0.9670 | 0.9671 |
| 0.0456 | 8.6047 | 7400 | 0.1736 | 0.9679 | 0.9681 |
| 0.0456 | 8.8372 | 7600 | 0.1827 | 0.9674 | 0.9675 |
| 0.0387 | 9.0698 | 7800 | 0.1919 | 0.9686 | 0.9686 |
| 0.0387 | 9.3023 | 8000 | 0.1839 | 0.9689 | 0.9690 |
| 0.0387 | 9.5349 | 8200 | 0.1987 | 0.9674 | 0.9673 |
| 0.0387 | 9.7674 | 8400 | 0.1908 | 0.9679 | 0.9681 |
| 0.0315 | 10.0 | 8600 | 0.1880 | 0.9678 | 0.9680 |
| 0.0315 | 10.2326 | 8800 | 0.2043 | 0.9689 | 0.9691 |
| 0.0315 | 10.4651 | 9000 | 0.1936 | 0.9688 | 0.9688 |
| 0.0315 | 10.6977 | 9200 | 0.1643 | 0.9674 | 0.9675 |
| 0.0315 | 10.9302 | 9400 | 0.1884 | 0.9680 | 0.9680 |
| 0.0281 | 11.1628 | 9600 | 0.1910 | 0.9672 | 0.9674 |
| 0.0281 | 11.3953 | 9800 | 0.1883 | 0.9676 | 0.9676 |
| 0.0281 | 11.6279 | 10000 | 0.1829 | 0.9692 | 0.9693 |
| 0.0281 | 11.8605 | 10200 | 0.1896 | 0.9674 | 0.9675 |
| 0.0246 | 12.0930 | 10400 | 0.2215 | 0.9677 | 0.9677 |
| 0.0246 | 12.3256 | 10600 | 0.2046 | 0.9669 | 0.9670 |
| 0.0246 | 12.5581 | 10800 | 0.2104 | 0.9673 | 0.9674 |
| 0.0246 | 12.7907 | 11000 | 0.2025 | 0.9687 | 0.9688 |
| 0.0214 | 13.0233 | 11200 | 0.2057 | 0.9678 | 0.9678 |
| 0.0214 | 13.2558 | 11400 | 0.2248 | 0.9661 | 0.9660 |
| 0.0214 | 13.4884 | 11600 | 0.2242 | 0.9690 | 0.9690 |
| 0.0214 | 13.7209 | 11800 | 0.2218 | 0.9677 | 0.9677 |
| 0.0214 | 13.9535 | 12000 | 0.2150 | 0.9665 | 0.9664 |
| 0.0189 | 14.1860 | 12200 | 0.2232 | 0.9683 | 0.9684 |
| 0.0189 | 14.4186 | 12400 | 0.2241 | 0.9687 | 0.9688 |
| 0.0189 | 14.6512 | 12600 | 0.2155 | 0.9677 | 0.9678 |
| 0.0189 | 14.8837 | 12800 | 0.2133 | 0.9664 | 0.9664 |
| 0.0167 | 15.1163 | 13000 | 0.2243 | 0.9680 | 0.9680 |
| 0.0167 | 15.3488 | 13200 | 0.2238 | 0.9686 | 0.9686 |
| 0.0167 | 15.5814 | 13400 | 0.2407 | 0.9672 | 0.9672 |
| 0.0167 | 15.8140 | 13600 | 0.2383 | 0.9659 | 0.9658 |
| 0.014 | 16.0465 | 13800 | 0.2371 | 0.9671 | 0.9671 |
| 0.014 | 16.2791 | 14000 | 0.2507 | 0.9676 | 0.9676 |
| 0.014 | 16.5116 | 14200 | 0.2412 | 0.9674 | 0.9675 |
| 0.014 | 16.7442 | 14400 | 0.2475 | 0.9687 | 0.9687 |
| 0.014 | 16.9767 | 14600 | 0.2508 | 0.9679 | 0.9679 |
| 0.0115 | 17.2093 | 14800 | 0.2537 | 0.9691 | 0.9691 |
| 0.0115 | 17.4419 | 15000 | 0.2574 | 0.9679 | 0.9680 |
| 0.0115 | 17.6744 | 15200 | 0.2586 | 0.9679 | 0.9680 |
| 0.0115 | 17.9070 | 15400 | 0.2563 | 0.9675 | 0.9675 |
| 0.0102 | 18.1395 | 15600 | 0.2566 | 0.9685 | 0.9685 |
| 0.0102 | 18.3721 | 15800 | 0.2595 | 0.9695 | 0.9696 |
| 0.0102 | 18.6047 | 16000 | 0.2619 | 0.9687 | 0.9687 |
| 0.0102 | 18.8372 | 16200 | 0.2643 | 0.9679 | 0.9679 |
| 0.0093 | 19.0698 | 16400 | 0.2721 | 0.9688 | 0.9688 |
| 0.0093 | 19.3023 | 16600 | 0.2641 | 0.9683 | 0.9684 |
| 0.0093 | 19.5349 | 16800 | 0.2674 | 0.9686 | 0.9686 |
| 0.0093 | 19.7674 | 17000 | 0.2653 | 0.9687 | 0.9687 |
| 0.0097 | 20.0 | 17200 | 0.2650 | 0.9691 | 0.9692 |
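Note that validation loss bottoms out early (0.1018 at step 3200, around epoch 3.7) and rises steadily afterward while accuracy and F1 stay roughly flat, so the final-epoch checkpoint reported above is not the lowest-loss one. A small sketch of selecting the best checkpoint by validation loss from such a log (only three rows copied from the table for brevity):

```python
# (step, validation_loss, f1) rows copied from the training log above.
rows = [
    (3200, 0.1018, 0.9678),
    (3400, 0.1201, 0.9675),
    (17200, 0.2650, 0.9692),
]

# Select the checkpoint with the lowest validation loss.
best = min(rows, key=lambda r: r[1])
print(best)  # (3200, 0.1018, 0.9678)
```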

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.1.2
  • Datasets 2.18.0
  • Tokenizers 0.19.1