
predict-perception-xlmr-cause-none

This model is a fine-tuned version of xlm-roberta-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.8639
  • Rmse: 1.3661
  • Rmse Cause::a Spontanea, priva di un agente scatenante ("spontaneous, with no triggering agent"): 1.3661
  • Mae: 1.0795
  • Mae Cause::a Spontanea, priva di un agente scatenante: 1.0795
  • R2: -1.7872
  • R2 Cause::a Spontanea, priva di un agente scatenante: -1.7872
  • Cos: -0.3043
  • Pair: 0.0
  • Rank: 0.5
  • Neighbors: 0.3501
  • Rsa: nan
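The evaluation scores above are standard regression metrics. As a minimal stdlib-only sketch (the function name is illustrative, not taken from the training code), this is how RMSE, MAE, and R2 relate to the model's predictions and the gold labels:

```python
import math

def regression_metrics(preds, targets):
    """Compute RMSE, MAE, and R2 for paired predictions and gold targets."""
    n = len(preds)
    errors = [p - t for p, t in zip(preds, targets)]
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    mae = sum(abs(e) for e in errors) / n
    mean_t = sum(targets) / n
    ss_res = sum(e * e for e in errors)               # residual sum of squares
    ss_tot = sum((t - mean_t) ** 2 for t in targets)  # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    return rmse, mae, r2
```

Note that R2 can be negative, as it is here (-1.7872): that means the model's predictions fit the evaluation set worse than simply predicting the mean target value would.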

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 20
  • eval_batch_size: 8
  • seed: 1996
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 30
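For reference, the hyperparameters listed above can be collected into a single configuration mapping. This is only an illustrative sketch (the dictionary is not from the training script); the key names follow the `transformers.TrainingArguments` conventions that these values correspond to:

```python
# Hypothetical config mirroring the reported hyperparameters;
# key names follow transformers.TrainingArguments conventions.
training_config = {
    "learning_rate": 1e-05,
    "per_device_train_batch_size": 20,
    "per_device_eval_batch_size": 8,
    "seed": 1996,
    "adam_beta1": 0.9,          # Adam betas=(0.9, 0.999)
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-08,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 30,
}
```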

Training results

| Training Loss | Epoch | Step | Validation Loss | Rmse | Rmse Cause::a Spontanea, priva di un agente scatenante | Mae | Mae Cause::a Spontanea, priva di un agente scatenante | R2 | R2 Cause::a Spontanea, priva di un agente scatenante | Cos | Pair | Rank | Neighbors | Rsa |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.0626 | 1.0 | 15 | 0.6787 | 0.8244 | 0.8244 | 0.7453 | 0.7453 | -0.0149 | -0.0149 | 0.0435 | 0.0 | 0.5 | 0.2515 | nan |
| 1.0186 | 2.0 | 30 | 0.6769 | 0.8233 | 0.8233 | 0.7457 | 0.7457 | -0.0122 | -0.0122 | 0.0435 | 0.0 | 0.5 | 0.2515 | nan |
| 1.0346 | 3.0 | 45 | 0.6812 | 0.8259 | 0.8259 | 0.7489 | 0.7489 | -0.0187 | -0.0187 | 0.0435 | 0.0 | 0.5 | 0.2515 | nan |
| 0.9481 | 4.0 | 60 | 1.0027 | 1.0020 | 1.0020 | 0.8546 | 0.8546 | -0.4994 | -0.4994 | -0.3043 | 0.0 | 0.5 | 0.2579 | nan |
| 0.8838 | 5.0 | 75 | 0.9352 | 0.9677 | 0.9677 | 0.8463 | 0.8463 | -0.3985 | -0.3985 | -0.2174 | 0.0 | 0.5 | 0.2966 | nan |
| 0.7971 | 6.0 | 90 | 0.9396 | 0.9700 | 0.9700 | 0.8608 | 0.8608 | -0.4050 | -0.4050 | -0.2174 | 0.0 | 0.5 | 0.3156 | nan |
| 0.8182 | 7.0 | 105 | 0.9485 | 0.9746 | 0.9746 | 0.8509 | 0.8509 | -0.4184 | -0.4184 | -0.1304 | 0.0 | 0.5 | 0.2788 | nan |
| 0.696 | 8.0 | 120 | 1.1396 | 1.0682 | 1.0682 | 0.9309 | 0.9309 | -0.7041 | -0.7041 | -0.1304 | 0.0 | 0.5 | 0.2899 | nan |
| 0.6337 | 9.0 | 135 | 1.3064 | 1.1437 | 1.1437 | 0.9612 | 0.9612 | -0.9536 | -0.9536 | -0.3913 | 0.0 | 0.5 | 0.4018 | nan |
| 0.5308 | 10.0 | 150 | 1.2403 | 1.1144 | 1.1144 | 0.9359 | 0.9359 | -0.8547 | -0.8547 | -0.3913 | 0.0 | 0.5 | 0.4018 | nan |
| 0.5226 | 11.0 | 165 | 1.3433 | 1.1597 | 1.1597 | 0.9542 | 0.9542 | -1.0087 | -1.0087 | -0.3913 | 0.0 | 0.5 | 0.4018 | nan |
| 0.474 | 12.0 | 180 | 1.5321 | 1.2386 | 1.2386 | 1.0340 | 1.0340 | -1.2910 | -1.2910 | -0.3043 | 0.0 | 0.5 | 0.3205 | nan |
| 0.3899 | 13.0 | 195 | 1.6322 | 1.2784 | 1.2784 | 1.0083 | 1.0083 | -1.4408 | -1.4408 | -0.3043 | 0.0 | 0.5 | 0.3590 | nan |
| 0.3937 | 14.0 | 210 | 1.7519 | 1.3244 | 1.3244 | 1.0540 | 1.0540 | -1.6197 | -1.6197 | -0.3913 | 0.0 | 0.5 | 0.4018 | nan |
| 0.4128 | 15.0 | 225 | 1.8588 | 1.3643 | 1.3643 | 1.0765 | 1.0765 | -1.7797 | -1.7797 | -0.3913 | 0.0 | 0.5 | 0.4018 | nan |
| 0.3424 | 16.0 | 240 | 1.7211 | 1.3128 | 1.3128 | 1.0217 | 1.0217 | -1.5737 | -1.5737 | -0.3913 | 0.0 | 0.5 | 0.4018 | nan |
| 0.3307 | 17.0 | 255 | 1.7802 | 1.3351 | 1.3351 | 1.0790 | 1.0790 | -1.6621 | -1.6621 | -0.3043 | 0.0 | 0.5 | 0.3205 | nan |
| 0.2972 | 18.0 | 270 | 1.5272 | 1.2366 | 1.2366 | 0.9945 | 0.9945 | -1.2837 | -1.2837 | -0.3043 | 0.0 | 0.5 | 0.3501 | nan |
| 0.2862 | 19.0 | 285 | 1.7213 | 1.3128 | 1.3128 | 1.0574 | 1.0574 | -1.5740 | -1.5740 | -0.3913 | 0.0 | 0.5 | 0.3815 | nan |
| 0.2844 | 20.0 | 300 | 1.8999 | 1.3793 | 1.3793 | 1.0930 | 1.0930 | -1.8411 | -1.8411 | -0.3043 | 0.0 | 0.5 | 0.3501 | nan |
| 0.2404 | 21.0 | 315 | 1.9806 | 1.4082 | 1.4082 | 1.1221 | 1.1221 | -1.9617 | -1.9617 | -0.3913 | 0.0 | 0.5 | 0.3815 | nan |
| 0.2349 | 22.0 | 330 | 1.8649 | 1.3665 | 1.3665 | 1.0953 | 1.0953 | -1.7888 | -1.7888 | -0.3913 | 0.0 | 0.5 | 0.3815 | nan |
| 0.2323 | 23.0 | 345 | 1.8256 | 1.3520 | 1.3520 | 1.0694 | 1.0694 | -1.7299 | -1.7299 | -0.3913 | 0.0 | 0.5 | 0.4018 | nan |
| 0.2217 | 24.0 | 360 | 1.9150 | 1.3847 | 1.3847 | 1.1017 | 1.1017 | -1.8636 | -1.8636 | -0.3043 | 0.0 | 0.5 | 0.3501 | nan |
| 0.2262 | 25.0 | 375 | 1.8536 | 1.3624 | 1.3624 | 1.0667 | 1.0667 | -1.7719 | -1.7719 | -0.3043 | 0.0 | 0.5 | 0.3501 | nan |
| 0.2052 | 26.0 | 390 | 1.7727 | 1.3323 | 1.3323 | 1.0475 | 1.0475 | -1.6508 | -1.6508 | -0.3043 | 0.0 | 0.5 | 0.3501 | nan |
| 0.2121 | 27.0 | 405 | 1.8088 | 1.3458 | 1.3458 | 1.0588 | 1.0588 | -1.7048 | -1.7048 | -0.3043 | 0.0 | 0.5 | 0.3501 | nan |
| 0.1723 | 28.0 | 420 | 1.8283 | 1.3530 | 1.3530 | 1.0628 | 1.0628 | -1.7340 | -1.7340 | -0.3043 | 0.0 | 0.5 | 0.3501 | nan |
| 0.1932 | 29.0 | 435 | 1.8566 | 1.3635 | 1.3635 | 1.0763 | 1.0763 | -1.7764 | -1.7764 | -0.3043 | 0.0 | 0.5 | 0.3501 | nan |
| 0.2157 | 30.0 | 450 | 1.8639 | 1.3661 | 1.3661 | 1.0795 | 1.0795 | -1.7872 | -1.7872 | -0.3043 | 0.0 | 0.5 | 0.3501 | nan |

Framework versions

  • Transformers 4.16.2
  • Pytorch 1.10.2+cu113
  • Datasets 1.18.3
  • Tokenizers 0.11.0