---
license: mit
tags:
  - generated_from_trainer
datasets:
  - conll2003
metrics:
  - f1
model-index:
  - name: xlm-roberta-base-finetuned-conll2003
    results:
      - task:
          name: Token Classification
          type: token-classification
        dataset:
          name: conll2003
          type: conll2003
          config: conll2003
          split: validation
          args: conll2003
        metrics:
          - name: F1
            type: f1
            value: 0.948444966049124
---

# xlm-roberta-base-finetuned-conll2003

This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the conll2003 dataset. It achieves the following results on the evaluation set:

- Loss: 0.0898
- F1: 0.9484
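
For quick use, here is a minimal inference sketch with the `transformers` pipeline API. The repo id `LecJackS/xlm-roberta-base-finetuned-conll2003` is inferred from this card's title and author and is an assumption; substitute the actual checkpoint path if it differs.

```python
# Minimal inference sketch. The repo id below is an assumption inferred from
# this card's title; point it at the actual checkpoint if hosted elsewhere.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="LecJackS/xlm-roberta-base-finetuned-conll2003",
    aggregation_strategy="simple",  # merge word pieces into whole entities
)

print(ner("Hugging Face is based in New York City."))
```

`aggregation_strategy="simple"` groups XLM-RoBERTa's sub-word pieces back into whole entity spans, which is usually what you want for CoNLL-style NER output.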

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
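
Although this section is left blank, the metadata above names the conll2003 dataset, with the validation split used for the reported metrics. A minimal loading sketch with the `datasets` library:

```python
# Minimal sketch for loading the dataset named in this card's metadata.
# With Datasets 2.12.0 (listed under "Framework versions") this call works
# as-is; newer releases may require a different or mirrored dataset id.
from datasets import load_dataset

ds = load_dataset("conll2003")
print(ds)                           # train / validation / test splits
print(ds["train"][0]["tokens"])     # word-level tokens
print(ds["train"][0]["ner_tags"])   # integer NER labels (IOB2 tagging scheme)
```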

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
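
As a rough reproduction aid, here is how these values map onto `TrainingArguments`; everything not in the list above (output directory, evaluation strategy) is an assumption, not taken from this card:

```python
# Hedged sketch mapping the hyperparameters above onto TrainingArguments.
# Only the listed values come from this card; everything else is assumed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlm-roberta-base-finetuned-conll2003",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults
    # (adam_beta1, adam_beta2, adam_epsilon), so no override is needed.
    evaluation_strategy="epoch",  # assumed from the per-epoch results table
)
```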

### Training results

| Training Loss | Epoch | Step  | Validation Loss | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 0.1415        | 1.0   | 439   | 0.0447          | 0.9367 |
| 0.0429        | 2.0   | 878   | 0.0437          | 0.9310 |
| 0.0259        | 3.0   | 1317  | 0.0534          | 0.9328 |
| 0.0195        | 4.0   | 1756  | 0.0449          | 0.9429 |
| 0.0146        | 5.0   | 2195  | 0.0484          | 0.9421 |
| 0.0121        | 6.0   | 2634  | 0.0523          | 0.9392 |
| 0.0099        | 7.0   | 3073  | 0.0500          | 0.9428 |
| 0.0077        | 8.0   | 3512  | 0.0536          | 0.9423 |
| 0.008         | 9.0   | 3951  | 0.0672          | 0.9254 |
| 0.0079        | 10.0  | 4390  | 0.0589          | 0.9442 |
| 0.007         | 11.0  | 4829  | 0.0669          | 0.9400 |
| 0.0051        | 12.0  | 5268  | 0.0602          | 0.9409 |
| 0.0052        | 13.0  | 5707  | 0.0639          | 0.9441 |
| 0.0036        | 14.0  | 6146  | 0.0635          | 0.9431 |
| 0.0033        | 15.0  | 6585  | 0.0858          | 0.9328 |
| 0.0038        | 16.0  | 7024  | 0.0653          | 0.9478 |
| 0.0047        | 17.0  | 7463  | 0.0689          | 0.9431 |
| 0.0039        | 18.0  | 7902  | 0.0687          | 0.9442 |
| 0.0031        | 19.0  | 8341  | 0.0687          | 0.9459 |
| 0.0027        | 20.0  | 8780  | 0.0785          | 0.9424 |
| 0.0047        | 21.0  | 9219  | 0.0654          | 0.9444 |
| 0.0035        | 22.0  | 9658  | 0.0748          | 0.9454 |
| 0.0021        | 23.0  | 10097 | 0.0714          | 0.9423 |
| 0.003         | 24.0  | 10536 | 0.0730          | 0.9433 |
| 0.0031        | 25.0  | 10975 | 0.0682          | 0.9417 |
| 0.0021        | 26.0  | 11414 | 0.0762          | 0.9407 |
| 0.0025        | 27.0  | 11853 | 0.0773          | 0.9391 |
| 0.0019        | 28.0  | 12292 | 0.0739          | 0.9420 |
| 0.0032        | 29.0  | 12731 | 0.0755          | 0.9413 |
| 0.0023        | 30.0  | 13170 | 0.0755          | 0.9439 |
| 0.0024        | 31.0  | 13609 | 0.0747          | 0.9456 |
| 0.0018        | 32.0  | 14048 | 0.0730          | 0.9430 |
| 0.0017        | 33.0  | 14487 | 0.0866          | 0.9385 |
| 0.0019        | 34.0  | 14926 | 0.0695          | 0.9440 |
| 0.0016        | 35.0  | 15365 | 0.0818          | 0.9442 |
| 0.0034        | 36.0  | 15804 | 0.0750          | 0.9459 |
| 0.0019        | 37.0  | 16243 | 0.0808          | 0.9414 |
| 0.0013        | 38.0  | 16682 | 0.0797          | 0.9422 |
| 0.0015        | 39.0  | 17121 | 0.0814          | 0.9394 |
| 0.0019        | 40.0  | 17560 | 0.0757          | 0.9415 |
| 0.0011        | 41.0  | 17999 | 0.0778          | 0.9453 |
| 0.0011        | 42.0  | 18438 | 0.0825          | 0.9407 |
| 0.0012        | 43.0  | 18877 | 0.0767          | 0.9458 |
| 0.0022        | 44.0  | 19316 | 0.0865          | 0.9396 |
| 0.0009        | 45.0  | 19755 | 0.0826          | 0.9459 |
| 0.0008        | 46.0  | 20194 | 0.0819          | 0.9473 |
| 0.0017        | 47.0  | 20633 | 0.0844          | 0.9420 |
| 0.0015        | 48.0  | 21072 | 0.0827          | 0.9448 |
| 0.0014        | 49.0  | 21511 | 0.0800          | 0.9464 |
| 0.0008        | 50.0  | 21950 | 0.0770          | 0.9474 |
| 0.0011        | 51.0  | 22389 | 0.0766          | 0.9471 |
| 0.0006        | 52.0  | 22828 | 0.0896          | 0.9424 |
| 0.0011        | 53.0  | 23267 | 0.0866          | 0.9425 |
| 0.001         | 54.0  | 23706 | 0.0853          | 0.9426 |
| 0.0007        | 55.0  | 24145 | 0.0831          | 0.9462 |
| 0.0008        | 56.0  | 24584 | 0.0805          | 0.9457 |
| 0.0008        | 57.0  | 25023 | 0.0866          | 0.9438 |
| 0.0008        | 58.0  | 25462 | 0.0822          | 0.9421 |
| 0.0011        | 59.0  | 25901 | 0.0837          | 0.9417 |
| 0.0007        | 60.0  | 26340 | 0.0823          | 0.9466 |
| 0.0008        | 61.0  | 26779 | 0.0825          | 0.9425 |
| 0.0004        | 62.0  | 27218 | 0.0825          | 0.9433 |
| 0.0005        | 63.0  | 27657 | 0.0826          | 0.9435 |
| 0.0004        | 64.0  | 28096 | 0.0838          | 0.9437 |
| 0.0008        | 65.0  | 28535 | 0.0909          | 0.9424 |
| 0.0004        | 66.0  | 28974 | 0.0825          | 0.9464 |
| 0.0004        | 67.0  | 29413 | 0.0917          | 0.9454 |
| 0.0004        | 68.0  | 29852 | 0.0843          | 0.9487 |
| 0.0005        | 69.0  | 30291 | 0.0825          | 0.9481 |
| 0.0003        | 70.0  | 30730 | 0.0825          | 0.9456 |
| 0.0005        | 71.0  | 31169 | 0.0835          | 0.9460 |
| 0.0003        | 72.0  | 31608 | 0.0906          | 0.9481 |
| 0.0001        | 73.0  | 32047 | 0.0916          | 0.9471 |
| 0.0007        | 74.0  | 32486 | 0.0885          | 0.9460 |
| 0.0003        | 75.0  | 32925 | 0.0879          | 0.9481 |
| 0.0001        | 76.0  | 33364 | 0.0871          | 0.9505 |
| 0.0002        | 77.0  | 33803 | 0.0906          | 0.9486 |
| 0.0003        | 78.0  | 34242 | 0.0934          | 0.9469 |
| 0.0002        | 79.0  | 34681 | 0.0911          | 0.9466 |
| 0.0003        | 80.0  | 35120 | 0.0871          | 0.9489 |
| 0.0003        | 81.0  | 35559 | 0.0876          | 0.9494 |
| 0.0002        | 82.0  | 35998 | 0.0884          | 0.9482 |
| 0.0001        | 83.0  | 36437 | 0.0910          | 0.9469 |
| 0.0002        | 84.0  | 36876 | 0.0874          | 0.9473 |
| 0.0002        | 85.0  | 37315 | 0.0864          | 0.9463 |
| 0.0001        | 86.0  | 37754 | 0.0878          | 0.9472 |
| 0.0002        | 87.0  | 38193 | 0.0836          | 0.9500 |
| 0.0001        | 88.0  | 38632 | 0.0861          | 0.9495 |
| 0.0001        | 89.0  | 39071 | 0.0869          | 0.9503 |
| 0.0001        | 90.0  | 39510 | 0.0878          | 0.9480 |
| 0.0001        | 91.0  | 39949 | 0.0878          | 0.9501 |
| 0.0           | 92.0  | 40388 | 0.0886          | 0.9477 |
| 0.0001        | 93.0  | 40827 | 0.0884          | 0.9497 |
| 0.0001        | 94.0  | 41266 | 0.0897          | 0.9487 |
| 0.0001        | 95.0  | 41705 | 0.0896          | 0.9490 |
| 0.0001        | 96.0  | 42144 | 0.0879          | 0.9499 |
| 0.0001        | 97.0  | 42583 | 0.0884          | 0.9490 |
| 0.0001        | 98.0  | 43022 | 0.0899          | 0.9486 |
| 0.0001        | 99.0  | 43461 | 0.0897          | 0.9488 |
| 0.0001        | 100.0 | 43900 | 0.0898          | 0.9484 |
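
The card does not say which implementation produced the F1 scores above; for CoNLL-style NER the usual choice is entity-level F1 via `seqeval`, which is assumed in this sketch:

```python
# Hedged sketch of an entity-level F1 computation with seqeval via evaluate.
# The card does not state which implementation produced its F1 numbers;
# seqeval (pip install seqeval) is the common choice and is assumed here.
import evaluate

seqeval = evaluate.load("seqeval")

predictions = [["B-PER", "I-PER", "O", "B-LOC"]]
references = [["B-PER", "I-PER", "O", "B-LOC"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_f1"])  # 1.0 for this toy example
```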

### Framework versions

- Transformers 4.29.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3