xlm-roberta-base-finetuned-ner

This model is a fine-tuned version of xlm-roberta-base on the wikiann dataset (a minimal inference sketch follows the results below). It achieves the following results on the evaluation set:

  • Loss: 0.5647
  • Precision: 0.8684
  • Recall: 0.8656
  • F1: 0.8670
  • Accuracy: 0.9450
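
The card ships without a usage snippet, so here is a minimal inference sketch using the transformers pipeline API. The repo id is taken from the card title and may need the owner's namespace prefix; the example sentence and printed output are illustrative only, not taken from the card.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as an NER (token-classification) pipeline.
# The repo id is assumed from the card title; prepend the owner's namespace
# (e.g. "username/xlm-roberta-base-finetuned-ner") if required.
ner = pipeline(
    "token-classification",
    model="xlm-roberta-base-finetuned-ner",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entities
)

print(ner("Hugging Face is based in New York City."))
# Each entry holds the entity group, confidence score, surface form, and
# character span, e.g. {'entity_group': 'ORG', 'word': 'Hugging Face', ...}
```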

Model description

This is xlm-roberta-base fine-tuned for named entity recognition (token classification) on the wikiann dataset.

Intended uses & limitations

More information needed

Training and evaluation data

The model was fine-tuned and evaluated on the wikiann dataset.

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 3
  • eval_batch_size: 3
  • seed: 42
  • distributed_type: multi-GPU
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
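
The training script itself is not included in the card; the sketch below shows one plausible mapping of these values onto transformers `TrainingArguments`. The output path is a placeholder and anything not listed above is an assumption, not the author's actual configuration; the multi-GPU distribution noted above would come from the launcher (e.g. `python -m torch.distributed.launch`), not from an argument here.

```python
from transformers import TrainingArguments

# One plausible TrainingArguments setup matching the hyperparameter list.
# Only values reported in the card are meaningful; output_dir and any
# unlisted options are placeholders/defaults, not the author's real config.
training_args = TrainingArguments(
    output_dir="xlm-roberta-base-finetuned-ner",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=3,
    per_device_eval_batch_size=3,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    # Adam settings as reported in the card (also the library defaults).
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```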

Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.3537 | 1.0 | 6667 | 0.3621 | 0.7951 | 0.8054 | 0.8002 | 0.9187 |
| 0.2468 | 2.0 | 13334 | 0.3024 | 0.8341 | 0.8451 | 0.8396 | 0.9359 |
| 0.1705 | 3.0 | 20001 | 0.3255 | 0.8401 | 0.8328 | 0.8364 | 0.9365 |
| 0.1388 | 4.0 | 26668 | 0.3530 | 0.8438 | 0.8527 | 0.8482 | 0.9389 |
| 0.0979 | 5.0 | 33335 | 0.3980 | 0.8445 | 0.8542 | 0.8494 | 0.9390 |
| 0.0946 | 6.0 | 40002 | 0.3863 | 0.8500 | 0.8622 | 0.8560 | 0.9426 |
| 0.0908 | 7.0 | 46669 | 0.3991 | 0.8519 | 0.8633 | 0.8576 | 0.9420 |
| 0.0712 | 8.0 | 53336 | 0.4065 | 0.8617 | 0.8551 | 0.8584 | 0.9424 |
| 0.0568 | 9.0 | 60003 | 0.4348 | 0.8441 | 0.8663 | 0.8551 | 0.9413 |
| 0.0448 | 10.0 | 66670 | 0.4661 | 0.8429 | 0.8603 | 0.8515 | 0.9404 |
| 0.0687 | 11.0 | 73337 | 0.4482 | 0.8561 | 0.8621 | 0.8591 | 0.9431 |
| 0.0552 | 12.0 | 80004 | 0.4527 | 0.8499 | 0.8619 | 0.8558 | 0.9405 |
| 0.059 | 13.0 | 86671 | 0.4688 | 0.8564 | 0.8592 | 0.8578 | 0.9428 |
| 0.0362 | 14.0 | 93338 | 0.4593 | 0.8705 | 0.8615 | 0.8660 | 0.9451 |
| 0.0407 | 15.0 | 100005 | 0.4661 | 0.8647 | 0.8674 | 0.8660 | 0.9449 |
| 0.0278 | 16.0 | 106672 | 0.4794 | 0.8670 | 0.8707 | 0.8688 | 0.9457 |
| 0.0425 | 17.0 | 113339 | 0.5056 | 0.8548 | 0.8698 | 0.8622 | 0.9440 |
| 0.0251 | 18.0 | 120006 | 0.4630 | 0.8658 | 0.8603 | 0.8630 | 0.9442 |
| 0.0207 | 19.0 | 126673 | 0.5077 | 0.8515 | 0.8574 | 0.8544 | 0.9420 |
| 0.0245 | 20.0 | 133340 | 0.5130 | 0.8630 | 0.8646 | 0.8638 | 0.9437 |
| 0.051 | 21.0 | 140007 | 0.5233 | 0.8578 | 0.8644 | 0.8611 | 0.9423 |
| 0.0381 | 22.0 | 146674 | 0.5269 | 0.8688 | 0.8635 | 0.8661 | 0.9433 |
| 0.0144 | 23.0 | 153341 | 0.5137 | 0.8572 | 0.8668 | 0.8620 | 0.9443 |
| 0.0237 | 24.0 | 160008 | 0.5121 | 0.8741 | 0.8552 | 0.8645 | 0.9443 |
| 0.0175 | 25.0 | 166675 | 0.5019 | 0.8665 | 0.8725 | 0.8695 | 0.9467 |
| 0.0268 | 26.0 | 173342 | 0.5247 | 0.8597 | 0.8696 | 0.8646 | 0.9433 |
| 0.0128 | 27.0 | 180009 | 0.5075 | 0.8696 | 0.8704 | 0.8700 | 0.9461 |
| 0.0299 | 28.0 | 186676 | 0.5066 | 0.8647 | 0.8636 | 0.8641 | 0.9444 |
| 0.018 | 29.0 | 193343 | 0.5421 | 0.8677 | 0.8609 | 0.8643 | 0.9432 |
| 0.0264 | 30.0 | 200010 | 0.5023 | 0.8479 | 0.8731 | 0.8603 | 0.9424 |
| 0.0169 | 31.0 | 206677 | 0.5215 | 0.8672 | 0.8653 | 0.8662 | 0.9435 |
| 0.0185 | 32.0 | 213344 | 0.5184 | 0.8698 | 0.8630 | 0.8664 | 0.9457 |
| 0.0159 | 33.0 | 220011 | 0.4930 | 0.8653 | 0.8662 | 0.8657 | 0.9448 |
| 0.026 | 34.0 | 226678 | 0.4976 | 0.8579 | 0.8794 | 0.8685 | 0.9456 |
| 0.016 | 35.0 | 233345 | 0.5671 | 0.8517 | 0.8689 | 0.8602 | 0.9421 |
| 0.0186 | 36.0 | 240012 | 0.4881 | 0.8706 | 0.8752 | 0.8729 | 0.9467 |
| 0.0253 | 37.0 | 246679 | 0.5351 | 0.8621 | 0.8725 | 0.8673 | 0.9447 |
| 0.0086 | 38.0 | 253346 | 0.5759 | 0.8742 | 0.8612 | 0.8677 | 0.9440 |
| 0.0157 | 39.0 | 260013 | 0.5362 | 0.8549 | 0.8696 | 0.8622 | 0.9436 |
| 0.0107 | 40.0 | 266680 | 0.5734 | 0.8730 | 0.8582 | 0.8655 | 0.9438 |
| 0.0139 | 41.0 | 273347 | 0.4995 | 0.8622 | 0.8729 | 0.8675 | 0.9457 |
| 0.0141 | 42.0 | 280014 | 0.5567 | 0.8651 | 0.8671 | 0.8661 | 0.9448 |
| 0.0146 | 43.0 | 286681 | 0.5124 | 0.8673 | 0.8691 | 0.8682 | 0.9460 |
| 0.0125 | 44.0 | 293348 | 0.5511 | 0.8568 | 0.8758 | 0.8662 | 0.9440 |
| 0.0153 | 45.0 | 300015 | 0.5385 | 0.8597 | 0.8720 | 0.8658 | 0.9445 |
| 0.017 | 46.0 | 306682 | 0.5302 | 0.8633 | 0.8714 | 0.8673 | 0.9448 |
| 0.0121 | 47.0 | 313349 | 0.5302 | 0.8604 | 0.8666 | 0.8635 | 0.9441 |
| 0.0136 | 48.0 | 320016 | 0.5639 | 0.8481 | 0.8677 | 0.8578 | 0.9404 |
| 0.0107 | 49.0 | 326683 | 0.5403 | 0.8731 | 0.8648 | 0.8689 | 0.9457 |
| 0.0083 | 50.0 | 333350 | 0.5615 | 0.8770 | 0.8581 | 0.8675 | 0.9431 |
| 0.0121 | 51.0 | 340017 | 0.5489 | 0.8512 | 0.8730 | 0.8620 | 0.9439 |
| 0.0079 | 52.0 | 346684 | 0.5328 | 0.8599 | 0.8736 | 0.8667 | 0.9458 |
| 0.0139 | 53.0 | 353351 | 0.5572 | 0.8665 | 0.8631 | 0.8648 | 0.9441 |
| 0.0138 | 54.0 | 360018 | 0.5128 | 0.8662 | 0.8740 | 0.8701 | 0.9468 |
| 0.014 | 55.0 | 366685 | 0.5603 | 0.8798 | 0.8662 | 0.8730 | 0.9460 |
| 0.0319 | 56.0 | 373352 | 0.5508 | 0.8631 | 0.8688 | 0.8659 | 0.9427 |
| 0.0152 | 57.0 | 380019 | 0.5716 | 0.8596 | 0.8644 | 0.8620 | 0.9429 |
| 0.0249 | 58.0 | 386686 | 0.5692 | 0.8595 | 0.8749 | 0.8671 | 0.9453 |
| 0.0161 | 59.0 | 393353 | 0.5483 | 0.8665 | 0.8715 | 0.8690 | 0.9463 |
| 0.0157 | 60.0 | 400020 | 0.5588 | 0.8603 | 0.8800 | 0.8701 | 0.9463 |
| 0.0247 | 61.0 | 406687 | 0.5265 | 0.8510 | 0.8662 | 0.8585 | 0.9417 |
| 0.0069 | 62.0 | 413354 | 0.5578 | 0.8681 | 0.8679 | 0.8680 | 0.9459 |
| 0.0254 | 63.0 | 420021 | 0.5756 | 0.8620 | 0.8646 | 0.8633 | 0.9435 |
| 0.0182 | 64.0 | 426688 | 0.5323 | 0.8651 | 0.8762 | 0.8707 | 0.9458 |
| 0.0237 | 65.0 | 433355 | 0.5342 | 0.8592 | 0.8724 | 0.8657 | 0.9443 |
| 0.0234 | 66.0 | 440022 | 0.5458 | 0.8653 | 0.8679 | 0.8666 | 0.9437 |
| 0.0159 | 67.0 | 446689 | 0.5166 | 0.8781 | 0.8624 | 0.8702 | 0.9448 |
| 0.0204 | 68.0 | 453356 | 0.5499 | 0.8658 | 0.8723 | 0.8690 | 0.9452 |
| 0.0117 | 69.0 | 460023 | 0.5573 | 0.8572 | 0.8714 | 0.8642 | 0.9432 |
| 0.0062 | 70.0 | 466690 | 0.5887 | 0.8592 | 0.8675 | 0.8633 | 0.9422 |
| 0.0123 | 71.0 | 473357 | 0.5138 | 0.8600 | 0.8699 | 0.8649 | 0.9448 |
| 0.0079 | 72.0 | 480024 | 0.5548 | 0.8610 | 0.8724 | 0.8666 | 0.9447 |
| 0.0061 | 73.0 | 486691 | 0.5872 | 0.8476 | 0.8675 | 0.8574 | 0.9415 |
| 0.0129 | 74.0 | 493358 | 0.5520 | 0.8727 | 0.8595 | 0.8661 | 0.9449 |
| 0.0159 | 75.0 | 500025 | 0.5427 | 0.8611 | 0.8674 | 0.8642 | 0.9435 |
| 0.0258 | 76.0 | 506692 | 0.5402 | 0.8672 | 0.8702 | 0.8687 | 0.9448 |
| 0.0151 | 77.0 | 513359 | 0.5589 | 0.8681 | 0.8704 | 0.8693 | 0.9457 |
| 0.0075 | 78.0 | 520026 | 0.5754 | 0.8613 | 0.8682 | 0.8647 | 0.9438 |
| 0.0076 | 79.0 | 526693 | 0.5709 | 0.8608 | 0.8646 | 0.8627 | 0.9445 |
| 0.0196 | 80.0 | 533360 | 0.5252 | 0.8714 | 0.8706 | 0.8710 | 0.9461 |
| 0.0123 | 81.0 | 540027 | 0.5857 | 0.8637 | 0.8631 | 0.8634 | 0.9437 |
| 0.0205 | 82.0 | 546694 | 0.5805 | 0.8642 | 0.8655 | 0.8648 | 0.9431 |
| 0.0065 | 83.0 | 553361 | 0.5815 | 0.8619 | 0.8626 | 0.8622 | 0.9431 |
| 0.0128 | 84.0 | 560028 | 0.6305 | 0.8498 | 0.8646 | 0.8571 | 0.9402 |
| 0.0118 | 85.0 | 566695 | 0.5620 | 0.8648 | 0.8682 | 0.8665 | 0.9445 |
| 0.0173 | 86.0 | 573362 | 0.5714 | 0.8655 | 0.8657 | 0.8656 | 0.9442 |
| 0.0107 | 87.0 | 580029 | 0.5845 | 0.8603 | 0.8649 | 0.8626 | 0.9418 |
| 0.0218 | 88.0 | 586696 | 0.5259 | 0.8708 | 0.8697 | 0.8703 | 0.9449 |
| 0.0039 | 89.0 | 593363 | 0.5809 | 0.8800 | 0.8648 | 0.8723 | 0.9465 |
| 0.0076 | 90.0 | 600030 | 0.5852 | 0.8744 | 0.8615 | 0.8679 | 0.9443 |
| 0.008 | 91.0 | 606697 | 0.5540 | 0.8689 | 0.8683 | 0.8686 | 0.9454 |
| 0.0114 | 92.0 | 613364 | 0.5836 | 0.8578 | 0.8639 | 0.8609 | 0.9422 |
| 0.0245 | 93.0 | 620031 | 0.5808 | 0.8735 | 0.8672 | 0.8703 | 0.9450 |
| 0.0142 | 94.0 | 626698 | 0.5846 | 0.8630 | 0.8692 | 0.8661 | 0.9429 |
| 0.0013 | 95.0 | 633365 | 0.5495 | 0.8656 | 0.8605 | 0.8630 | 0.9432 |
| 0.0093 | 96.0 | 640032 | 0.6049 | 0.8660 | 0.8656 | 0.8658 | 0.9436 |
| 0.012 | 97.0 | 646699 | 0.5802 | 0.8633 | 0.8618 | 0.8626 | 0.9427 |
| 0.0042 | 98.0 | 653366 | 0.5851 | 0.8571 | 0.8658 | 0.8615 | 0.9422 |
| 0.0143 | 99.0 | 660033 | 0.5619 | 0.8671 | 0.8626 | 0.8649 | 0.9437 |
| 0.0173 | 100.0 | 666700 | 0.5647 | 0.8684 | 0.8656 | 0.8670 | 0.9450 |
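
The per-epoch precision/recall/F1/accuracy columns are the span-level scores that Trainer-based NER runs usually report via the seqeval metric. The sketch below shows how such numbers are computed; the labels are hypothetical, it requires the seqeval package, and it is not necessarily the exact evaluation code used for this run.

```python
from datasets import load_metric

# seqeval scores predictions at the entity-span level; Trainer-generated NER
# cards typically take precision/recall/F1/accuracy from its "overall_*" keys.
metric = load_metric("seqeval")  # requires: pip install seqeval

# Hypothetical IOB2-tagged predictions and references for two sentences.
predictions = [["B-PER", "I-PER", "O", "B-ORG"], ["O", "B-LOC", "O"]]
references = [["B-PER", "I-PER", "O", "B-ORG"], ["O", "B-ORG", "O"]]

results = metric.compute(predictions=predictions, references=references)
print({k: results[k] for k in
       ("overall_precision", "overall_recall",
        "overall_f1", "overall_accuracy")})
```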

Framework versions

  • Transformers 4.9.2
  • Pytorch 1.9.0+cu111
  • Datasets 1.11.0
  • Tokenizers 0.10.3