
deberta-v3-large-survey-new_fact_related_passage-rater-all

This model is a fine-tuned version of microsoft/deberta-v3-large on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4933
  • Krippendorff: 0.8215
  • Spearman: 0.8423
  • Absolute Agreement: 0.8468
  • Agreement Within One: 0.9389
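
The checkpoint can be loaded like any Transformers sequence-classification model. The sketch below is a minimal example under a few assumptions: the rating levels are exposed as classification labels, and the path shown is a placeholder to be replaced with the actual Hub repo id or a local directory.

```python
# Minimal inference sketch (assumptions: the checkpoint is a sequence-classification
# rater, and the path below is replaced with the real Hub repo id or local folder).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_path = "deberta-v3-large-survey-new_fact_related_passage-rater-all"  # placeholder path
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForSequenceClassification.from_pretrained(model_path)
model.eval()

passage = "Example passage to rate."  # hypothetical input text
inputs = tokenizer(passage, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
predicted_rating = logits.argmax(dim=-1).item()
print(predicted_rating, model.config.id2label[predicted_rating])
```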

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 6e-06
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 20
  • mixed_precision_training: Native AMP
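
A minimal sketch of a Trainer configuration matching these values is shown below; the output directory, dataset objects, and metric function are placeholders, and the Adam betas/epsilon listed above are the optimizer defaults.

```python
# Hedged Trainer setup matching the listed hyperparameters; the model, datasets,
# tokenization, and compute_metrics are assumed to be defined elsewhere.
from transformers import Trainer, TrainingArguments

args = TrainingArguments(
    output_dir="deberta-v3-large-rater",  # placeholder
    learning_rate=6e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=20,
    fp16=True,                            # Native AMP mixed precision
    evaluation_strategy="epoch",          # assumption: per-epoch evaluation, as in the results table
)

# trainer = Trainer(model=model, args=args, train_dataset=train_ds,
#                   eval_dataset=eval_ds, compute_metrics=compute_metrics)
# trainer.train()
```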

Training results

| Training Loss | Epoch | Step | Validation Loss | Krippendorff | Spearman | Absolute Agreement | Agreement Within One |
|:-------------:|:-----:|:----:|:---------------:|:------------:|:--------:|:------------------:|:--------------------:|
| No log        | 1.0   | 55   | 1.9220          | -0.5318      | nan      | 0.1806             | 1.0                  |
| No log        | 2.0   | 110  | 1.9208          | -0.5318      | nan      | 0.1806             | 1.0                  |
| No log        | 3.0   | 165  | 1.9200          | -0.5318      | nan      | 0.1806             | 1.0                  |
| No log        | 4.0   | 220  | 1.9197          | -0.5318      | nan      | 0.1806             | 1.0                  |
| No log        | 5.0   | 275  | 1.9655          | -0.5318      | nan      | 0.1806             | 1.0                  |
| No log        | 6.0   | 330  | 1.9944          | -0.4584      | -0.1731  | 0.1528             | 0.9167               |
| No log        | 7.0   | 385  | 1.9699          | -0.3160      | -0.3151  | 0.1389             | 0.6944               |
| No log        | 8.0   | 440  | 1.8888          | -0.3710      | -0.3364  | 0.1667             | 0.6806               |
| No log        | 9.0   | 495  | 1.8273          | 0.1437       | 0.2775   | 0.375              | 0.625                |
| 1.6819        | 10.0  | 550  | 1.7380          | -0.0391      | 0.1674   | 0.3472             | 0.5833               |
| 1.6819        | 11.0  | 605  | 1.6954          | -0.3105      | 0.0146   | 0.3333             | 0.5417               |
| 1.6819        | 12.0  | 660  | 1.6211          | 0.3129       | 0.3446   | 0.4028             | 0.6806               |
| 1.6819        | 13.0  | 715  | 1.5305          | 0.2303       | 0.4155   | 0.4028             | 0.625                |
| 1.6819        | 14.0  | 770  | 1.4929          | 0.3973       | 0.4054   | 0.4306             | 0.7083               |
| 1.6819        | 15.0  | 825  | 1.4731          | 0.3220       | 0.3517   | 0.4167             | 0.6806               |
| 1.6819        | 16.0  | 880  | 1.4963          | 0.0524       | 0.3087   | 0.3889             | 0.5972               |
| 1.6819        | 17.0  | 935  | 1.4038          | 0.3960       | 0.4802   | 0.4722             | 0.6944               |
| 1.6819        | 18.0  | 990  | 1.3351          | 0.5195       | 0.4801   | 0.5139             | 0.7778               |
| 0.9585        | 19.0  | 1045 | 1.3013          | 0.4739       | 0.5167   | 0.5139             | 0.7361               |
| 0.9585        | 20.0  | 1100 | 1.2788          | 0.4785       | 0.4990   | 0.4861             | 0.7361               |
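
The table's metrics can be reproduced approximately from predicted and gold ratings as sketched below. The exact implementation used for this card is not documented, so the function and the off-by-one reading of "Agreement Within One" are assumptions based on the column headers; Krippendorff's alpha is omitted here since it requires a dedicated package.

```python
# Rough metric sketch (assumptions: predictions and labels are integer rating levels;
# "Agreement Within One" is taken to mean |pred - gold| <= 1).
import numpy as np
from scipy.stats import spearmanr

def rating_metrics(preds, labels):
    preds = np.asarray(preds)
    labels = np.asarray(labels)
    return {
        "spearman": spearmanr(preds, labels).correlation,                     # rank correlation
        "absolute_agreement": float(np.mean(preds == labels)),                # exact-match rate
        "agreement_within_one": float(np.mean(np.abs(preds - labels) <= 1)),  # off-by-one tolerance
    }

print(rating_metrics([2, 3, 1, 4], [2, 2, 1, 4]))  # hypothetical ratings
```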

Framework versions

  • Transformers 4.26.0
  • Pytorch 1.13.1
  • Datasets 2.10.1
  • Tokenizers 0.12.1