
deberta-v3-large-survey-cross_passage_consistency-rater-half-gpt4

This model is a fine-tuned version of microsoft/deberta-v3-large on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5957
  • Krippendorff: 0.5530
  • Spearman: 0.6991
  • Absolute Agreement: 0.8221
  • Agreement Within One: 0.9002

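The evaluation script is not included in the card. Below is a minimal sketch, assuming the labels are integer rating levels, of how these four metrics could be computed with the third-party `krippendorff` and `scipy` packages (the ordinal measurement level is an assumption):

```python
# Minimal sketch (not the author's script) of the four reported metrics,
# assuming predictions and gold labels are integer ratings.
import numpy as np
from scipy.stats import spearmanr
import krippendorff  # pip install krippendorff

def rating_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    # Krippendorff's alpha, treating the model and the reference
    # as two raters over the same units (ordinal level assumed).
    alpha = krippendorff.alpha(
        reliability_data=np.vstack([preds, labels]),
        level_of_measurement="ordinal",
    )
    # Spearman rank correlation between predicted and gold ratings.
    rho, _ = spearmanr(preds, labels)
    return {
        "krippendorff": alpha,
        "spearman": rho,
        # Exact-match rate between predicted and gold ratings.
        "absolute_agreement": float(np.mean(preds == labels)),
        # Share of predictions at most one rating step from the gold label.
        "agreement_within_one": float(np.mean(np.abs(preds - labels) <= 1)),
    }
```
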
Model description

More information needed

Intended uses & limitations

More information needed
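
The model name suggests it rates the consistency of one passage against another, but the card does not document the input format or label scale. A hypothetical usage sketch, assuming a standard sequence-classification head and a sentence-pair input (the repo namespace and example passages are placeholders):

```python
# Hypothetical usage sketch; the input template (passage pair) and the
# "your-namespace/..." repo id are assumptions, not documented in the card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "your-namespace/deberta-v3-large-survey-cross_passage_consistency-rater-half-gpt4"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

passage_a = "The trial enrolled 120 patients."
passage_b = "A total of 120 participants were recruited for the study."

# Encode the two passages as a sentence pair; take the argmax class
# as the predicted consistency rating.
inputs = tokenizer(passage_a, passage_b, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print("Predicted consistency rating:", logits.argmax(dim=-1).item())
```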

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-06
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 20
  • mixed_precision_training: Native AMP

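The training script itself is not part of the card. The sketch below maps the listed hyperparameters onto the `transformers` `TrainingArguments` API under the assumption that the standard `Trainer` was used; the `output_dir` and the per-epoch evaluation strategy are assumptions:

```python
# Sketch of the listed hyperparameters expressed as TrainingArguments;
# output_dir and evaluation_strategy are assumptions.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="deberta-v3-large-consistency-rater",  # placeholder
    learning_rate=6e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the Trainer
    # default optimizer, so no explicit optimizer arguments are needed.
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=20,
    fp16=True,  # native AMP mixed-precision training (requires a GPU)
    evaluation_strategy="epoch",  # the results table reports per-epoch eval
)
```
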
Training results

| Training Loss | Epoch | Step | Validation Loss | Krippendorff | Spearman | Absolute Agreement | Agreement Within One |
|:-------------:|:-----:|:----:|:---------------:|:------------:|:--------:|:------------------:|:--------------------:|
| No log | 1.0 | 52 | 1.9600 | -0.4776 | 0.3301 | 0.0556 | 0.9722 |
| No log | 2.0 | 104 | 1.9258 | -0.2795 | 0.2827 | 0.0833 | 0.9722 |
| No log | 3.0 | 156 | 1.9351 | -0.3095 | 0.2873 | 0.0833 | 0.9722 |
| No log | 4.0 | 208 | 2.0079 | -0.1471 | -0.1057 | 0.0833 | 0.8333 |
| No log | 5.0 | 260 | 2.2469 | -0.2860 | nan | 0.1944 | 0.8056 |
| No log | 6.0 | 312 | 2.2186 | -0.2860 | nan | 0.1944 | 0.8056 |
| No log | 7.0 | 364 | 2.2266 | -0.2860 | nan | 0.1944 | 0.8056 |
| No log | 8.0 | 416 | 2.2258 | -0.2860 | nan | 0.1944 | 0.8056 |
| No log | 9.0 | 468 | 2.2048 | -0.2860 | nan | 0.1944 | 0.8056 |
| 1.3796 | 10.0 | 520 | 2.2347 | -0.2860 | nan | 0.1944 | 0.8056 |
| 1.3796 | 11.0 | 572 | 2.2480 | -0.2860 | nan | 0.1944 | 0.8056 |
| 1.3796 | 12.0 | 624 | 2.1409 | -0.2349 | 0.0 | 0.1944 | 0.8056 |
| 1.3796 | 13.0 | 676 | 2.0869 | 0.0452 | 0.1218 | 0.1944 | 0.8333 |
| 1.3796 | 14.0 | 728 | 2.1785 | 0.1597 | 0.2372 | 0.1944 | 0.8611 |
| 1.3796 | 15.0 | 780 | 2.1684 | 0.0452 | 0.1218 | 0.1944 | 0.8333 |
| 1.3796 | 16.0 | 832 | 2.3356 | 0.0185 | 0.1110 | 0.1944 | 0.8333 |
| 1.3796 | 17.0 | 884 | 2.5980 | -0.0694 | 0.0783 | 0.2222 | 0.8333 |
| 1.3796 | 18.0 | 936 | 2.3700 | 0.0944 | 0.2085 | 0.1944 | 0.8611 |
| 1.3796 | 19.0 | 988 | 2.3355 | -0.0396 | 0.0913 | 0.1667 | 0.8611 |
| 0.7039 | 20.0 | 1040 | 2.7088 | -0.0625 | 0.0901 | 0.1667 | 0.8611 |

The nan Spearman values in epochs 5-12, where the other metrics are also frozen, are consistent with the model emitting a single constant rating, for which rank correlation is undefined.

Framework versions

  • Transformers 4.26.0
  • PyTorch 1.13.1
  • Datasets 2.10.1
  • Tokenizers 0.12.1