
clinico-xlm-roberta-finetuned

This model is a fine-tuned version of joheras/xlm-roberta-base-finetuned-clinais on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1866
  • Precision: 0.4629
  • Recall: 0.6281
  • F1: 0.5330
  • Accuracy: 0.8501
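
The metric set above (precision, recall, F1, accuracy) suggests a token-classification (sequence labelling) model. The sketch below shows one way to load it with Transformers; the repo id joheras/clinico-xlm-roberta-finetuned, the token-classification task, and the Spanish example sentence are assumptions, since the card does not state them.

```python
# Minimal usage sketch, assuming this is a token-classification checkpoint
# hosted on the Hub. The repo id below is hypothetical.
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

model_id = "joheras/clinico-xlm-roberta-finetuned"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Group sub-word predictions into labelled spans.
tagger = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)
print(tagger("Paciente de 70 años con dolor torácico y disnea."))
```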

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
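
For reference, the hyperparameters above map onto a Hugging Face TrainingArguments object roughly as sketched below. This is a hedged reconstruction: output_dir and evaluation_strategy are assumptions, not values reported by this card.

```python
# Hedged sketch of the reported hyperparameters as TrainingArguments.
# Only the values listed above come from the card; output_dir and
# evaluation_strategy are assumptions. Trainer's default optimizer is AdamW
# with the betas/epsilon shown.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="clinico-xlm-roberta-finetuned",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # assumed: metrics below are reported per epoch
)
```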

Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.0   | 25   | 1.2657          | 0.0046    | 0.0103 | 0.0064 | 0.5444   |
| No log        | 2.0   | 50   | 0.7933          | 0.1430    | 0.2609 | 0.1848 | 0.7711   |
| No log        | 3.0   | 75   | 0.6467          | 0.2741    | 0.4325 | 0.3356 | 0.8061   |
| No log        | 4.0   | 100  | 0.5961          | 0.3151    | 0.5217 | 0.3929 | 0.8233   |
| No log        | 5.0   | 125  | 0.5628          | 0.3288    | 0.5217 | 0.4034 | 0.8289   |
| No log        | 6.0   | 150  | 0.5540          | 0.2884    | 0.4920 | 0.3636 | 0.8377   |
| No log        | 7.0   | 175  | 0.5475          | 0.2960    | 0.4954 | 0.3706 | 0.8381   |
| No log        | 8.0   | 200  | 0.6013          | 0.3034    | 0.5297 | 0.3858 | 0.8347   |
| No log        | 9.0   | 225  | 0.6026          | 0.2989    | 0.5297 | 0.3822 | 0.8368   |
| No log        | 10.0  | 250  | 0.6055          | 0.3352    | 0.5366 | 0.4127 | 0.8422   |
| No log        | 11.0  | 275  | 0.6757          | 0.2982    | 0.5275 | 0.3810 | 0.8385   |
| No log        | 12.0  | 300  | 0.6287          | 0.3135    | 0.5355 | 0.3954 | 0.8464   |
| No log        | 13.0  | 325  | 0.7429          | 0.3441    | 0.5492 | 0.4231 | 0.8402   |
| No log        | 14.0  | 350  | 0.6883          | 0.3203    | 0.5538 | 0.4059 | 0.8491   |
| No log        | 15.0  | 375  | 0.7311          | 0.3550    | 0.5698 | 0.4374 | 0.8427   |
| No log        | 16.0  | 400  | 0.7084          | 0.3518    | 0.5595 | 0.4320 | 0.8481   |
| No log        | 17.0  | 425  | 0.7104          | 0.3545    | 0.5629 | 0.4350 | 0.8533   |
| No log        | 18.0  | 450  | 0.7958          | 0.3572    | 0.5709 | 0.4395 | 0.8381   |
| No log        | 19.0  | 475  | 0.7453          | 0.3616    | 0.5755 | 0.4442 | 0.8516   |
| 0.3605        | 20.0  | 500  | 0.7714          | 0.3573    | 0.5744 | 0.4405 | 0.8430   |
| 0.3605        | 21.0  | 525  | 0.8162          | 0.3664    | 0.5744 | 0.4474 | 0.8469   |
| 0.3605        | 22.0  | 550  | 0.7999          | 0.3711    | 0.5847 | 0.4540 | 0.8527   |
| 0.3605        | 23.0  | 575  | 0.8143          | 0.3968    | 0.5938 | 0.4757 | 0.8537   |
| 0.3605        | 24.0  | 600  | 0.8394          | 0.4078    | 0.5892 | 0.4820 | 0.8516   |
| 0.3605        | 25.0  | 625  | 0.8772          | 0.3778    | 0.5675 | 0.4536 | 0.8397   |
| 0.3605        | 26.0  | 650  | 0.8670          | 0.3991    | 0.6178 | 0.4850 | 0.8549   |
| 0.3605        | 27.0  | 675  | 0.8739          | 0.3886    | 0.5904 | 0.4687 | 0.8491   |
| 0.3605        | 28.0  | 700  | 0.9461          | 0.4081    | 0.5973 | 0.4849 | 0.8447   |
| 0.3605        | 29.0  | 725  | 0.9134          | 0.4267    | 0.6064 | 0.5009 | 0.8448   |
| 0.3605        | 30.0  | 750  | 0.9127          | 0.4057    | 0.5984 | 0.4836 | 0.8440   |
| 0.3605        | 31.0  | 775  | 0.9738          | 0.4129    | 0.5995 | 0.4890 | 0.8435   |
| 0.3605        | 32.0  | 800  | 1.0001          | 0.4074    | 0.5892 | 0.4818 | 0.8442   |
| 0.3605        | 33.0  | 825  | 0.9532          | 0.4133    | 0.6030 | 0.4905 | 0.8470   |
| 0.3605        | 34.0  | 850  | 0.9532          | 0.4080    | 0.6041 | 0.4871 | 0.8481   |
| 0.3605        | 35.0  | 875  | 0.9876          | 0.4108    | 0.6087 | 0.4905 | 0.8483   |
| 0.3605        | 36.0  | 900  | 0.9456          | 0.4219    | 0.6247 | 0.5037 | 0.8521   |
| 0.3605        | 37.0  | 925  | 0.9513          | 0.4180    | 0.6121 | 0.4968 | 0.8468   |
| 0.3605        | 38.0  | 950  | 0.9905          | 0.4120    | 0.6110 | 0.4922 | 0.8506   |
| 0.3605        | 39.0  | 975  | 0.9983          | 0.4365    | 0.6247 | 0.5139 | 0.8522   |
| 0.0271        | 40.0  | 1000 | 1.0220          | 0.4224    | 0.6076 | 0.4984 | 0.8480   |
| 0.0271        | 41.0  | 1025 | 1.0323          | 0.4114    | 0.6110 | 0.4917 | 0.8474   |
| 0.0271        | 42.0  | 1050 | 1.0651          | 0.4266    | 0.6121 | 0.5028 | 0.8482   |
| 0.0271        | 43.0  | 1075 | 1.0778          | 0.4101    | 0.5927 | 0.4848 | 0.8534   |
| 0.0271        | 44.0  | 1100 | 1.0190          | 0.4216    | 0.6087 | 0.4981 | 0.8469   |
| 0.0271        | 45.0  | 1125 | 1.0374          | 0.4245    | 0.6144 | 0.5021 | 0.8544   |
| 0.0271        | 46.0  | 1150 | 1.0792          | 0.4383    | 0.6018 | 0.5072 | 0.8518   |
| 0.0271        | 47.0  | 1175 | 1.0888          | 0.4267    | 0.6190 | 0.5051 | 0.8478   |
| 0.0271        | 48.0  | 1200 | 1.1022          | 0.4498    | 0.6156 | 0.5198 | 0.8490   |
| 0.0271        | 49.0  | 1225 | 1.1646          | 0.4398    | 0.6064 | 0.5099 | 0.8453   |
| 0.0271        | 50.0  | 1250 | 1.1448          | 0.4505    | 0.6087 | 0.5178 | 0.8478   |
| 0.0271        | 51.0  | 1275 | 1.1288          | 0.4388    | 0.6110 | 0.5108 | 0.8455   |
| 0.0271        | 52.0  | 1300 | 1.1077          | 0.4579    | 0.6224 | 0.5276 | 0.8478   |
| 0.0271        | 53.0  | 1325 | 1.0931          | 0.4373    | 0.6064 | 0.5081 | 0.8465   |
| 0.0271        | 54.0  | 1350 | 1.1044          | 0.4478    | 0.6087 | 0.5160 | 0.8471   |
| 0.0271        | 55.0  | 1375 | 1.0895          | 0.4343    | 0.6087 | 0.5069 | 0.8500   |
| 0.0271        | 56.0  | 1400 | 1.0768          | 0.4501    | 0.6144 | 0.5196 | 0.8532   |
| 0.0271        | 57.0  | 1425 | 1.1164          | 0.4356    | 0.6190 | 0.5113 | 0.8510   |
| 0.0271        | 58.0  | 1450 | 1.1378          | 0.4507    | 0.6167 | 0.5208 | 0.8505   |
| 0.0271        | 59.0  | 1475 | 1.1510          | 0.4583    | 0.6156 | 0.5254 | 0.8500   |
| 0.0063        | 60.0  | 1500 | 1.1126          | 0.4654    | 0.6224 | 0.5326 | 0.8514   |
| 0.0063        | 61.0  | 1525 | 1.1535          | 0.4548    | 0.6156 | 0.5231 | 0.8515   |
| 0.0063        | 62.0  | 1550 | 1.1362          | 0.4535    | 0.6247 | 0.5255 | 0.8505   |
| 0.0063        | 63.0  | 1575 | 1.1321          | 0.4723    | 0.6247 | 0.5379 | 0.8546   |
| 0.0063        | 64.0  | 1600 | 1.0995          | 0.4626    | 0.6304 | 0.5337 | 0.8561   |
| 0.0063        | 65.0  | 1625 | 1.1263          | 0.4546    | 0.6190 | 0.5242 | 0.8498   |
| 0.0063        | 66.0  | 1650 | 1.1251          | 0.4712    | 0.6270 | 0.5380 | 0.8549   |
| 0.0063        | 67.0  | 1675 | 1.1592          | 0.4745    | 0.6281 | 0.5406 | 0.8501   |
| 0.0063        | 68.0  | 1700 | 1.1552          | 0.4571    | 0.6281 | 0.5292 | 0.8514   |
| 0.0063        | 69.0  | 1725 | 1.1602          | 0.4618    | 0.6224 | 0.5302 | 0.8520   |
| 0.0063        | 70.0  | 1750 | 1.1631          | 0.4669    | 0.6304 | 0.5365 | 0.8527   |
| 0.0063        | 71.0  | 1775 | 1.1784          | 0.4824    | 0.6259 | 0.5448 | 0.8487   |
| 0.0063        | 72.0  | 1800 | 1.1779          | 0.4681    | 0.6213 | 0.5339 | 0.8527   |
| 0.0063        | 73.0  | 1825 | 1.1656          | 0.4478    | 0.6236 | 0.5213 | 0.8531   |
| 0.0063        | 74.0  | 1850 | 1.1743          | 0.4620    | 0.6190 | 0.5291 | 0.8528   |
| 0.0063        | 75.0  | 1875 | 1.1623          | 0.4529    | 0.6270 | 0.5259 | 0.8520   |
| 0.0063        | 76.0  | 1900 | 1.1597          | 0.4831    | 0.6201 | 0.5431 | 0.8507   |
| 0.0063        | 77.0  | 1925 | 1.1603          | 0.4743    | 0.6236 | 0.5388 | 0.8520   |
| 0.0063        | 78.0  | 1950 | 1.1551          | 0.4505    | 0.6190 | 0.5214 | 0.8500   |
| 0.0063        | 79.0  | 1975 | 1.1740          | 0.4772    | 0.6213 | 0.5398 | 0.8511   |
| 0.0026        | 80.0  | 2000 | 1.1463          | 0.4706    | 0.6224 | 0.5360 | 0.8519   |
| 0.0026        | 81.0  | 2025 | 1.1757          | 0.4603    | 0.6167 | 0.5271 | 0.8472   |
| 0.0026        | 82.0  | 2050 | 1.1754          | 0.4541    | 0.6224 | 0.5251 | 0.8457   |
| 0.0026        | 83.0  | 2075 | 1.1713          | 0.4588    | 0.6178 | 0.5266 | 0.8476   |
| 0.0026        | 84.0  | 2100 | 1.2023          | 0.4631    | 0.6247 | 0.5319 | 0.8473   |
| 0.0026        | 85.0  | 2125 | 1.1819          | 0.4841    | 0.6259 | 0.5459 | 0.8471   |
| 0.0026        | 86.0  | 2150 | 1.1878          | 0.4611    | 0.6236 | 0.5302 | 0.8470   |
| 0.0026        | 87.0  | 2175 | 1.1827          | 0.4694    | 0.6236 | 0.5356 | 0.8485   |
| 0.0026        | 88.0  | 2200 | 1.1787          | 0.4552    | 0.6213 | 0.5254 | 0.8506   |
| 0.0026        | 89.0  | 2225 | 1.1811          | 0.4762    | 0.6293 | 0.5421 | 0.8488   |
| 0.0026        | 90.0  | 2250 | 1.1849          | 0.4573    | 0.6247 | 0.5280 | 0.8493   |
| 0.0026        | 91.0  | 2275 | 1.1779          | 0.4505    | 0.6247 | 0.5235 | 0.8502   |
| 0.0026        | 92.0  | 2300 | 1.2042          | 0.4672    | 0.6201 | 0.5329 | 0.8493   |
| 0.0026        | 93.0  | 2325 | 1.1955          | 0.4712    | 0.6270 | 0.5380 | 0.8501   |
| 0.0026        | 94.0  | 2350 | 1.1950          | 0.4696    | 0.6281 | 0.5374 | 0.8503   |
| 0.0026        | 95.0  | 2375 | 1.1958          | 0.4769    | 0.6270 | 0.5418 | 0.8489   |
| 0.0026        | 96.0  | 2400 | 1.1819          | 0.4564    | 0.6281 | 0.5286 | 0.8496   |
| 0.0026        | 97.0  | 2425 | 1.1853          | 0.4677    | 0.6304 | 0.5370 | 0.8501   |
| 0.0026        | 98.0  | 2450 | 1.1822          | 0.4637    | 0.6281 | 0.5335 | 0.8501   |
| 0.0026        | 99.0  | 2475 | 1.1841          | 0.4571    | 0.6281 | 0.5292 | 0.8498   |
| 0.0014        | 100.0 | 2500 | 1.1866          | 0.4629    | 0.6281 | 0.5330 | 0.8501   |
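
The precision, recall, F1, and accuracy columns above are the entity-level metrics typically produced with seqeval in a Trainer compute_metrics callback. The card does not confirm this, so the sketch below is an assumed, generic setup rather than the exact evaluation code of this run; label_list is a hypothetical label set.

```python
# Hedged sketch of a seqeval-based compute_metrics function for token
# classification. The label set is hypothetical; the dataset is unspecified.
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-ENT", "I-ENT"]  # hypothetical labels

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Ignore positions labelled -100 (special tokens / sub-word continuations).
    true_preds = [
        [label_list[p] for (p, l) in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for (p, l) in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_preds, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```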

Framework versions

  • Transformers 4.27.0.dev0
  • PyTorch 1.13.0
  • Datasets 2.8.0
  • Tokenizers 0.12.1