
deberta-v3-large-survey-related_passage_consistency-rater-half-gpt4

This model is a fine-tuned version of microsoft/deberta-v3-large on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0922
  • Krippendorff: 0.9507
  • Spearman: 0.9810
  • Absolute Agreement: 0.9748
  • Agreement Within One: 0.9964

Model description

More information needed

Intended uses & limitations

More information needed
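A minimal inference sketch follows, assuming the checkpoint is used as a sequence-classification rater over a text pair; the hub namespace, the input schema, and the meaning of the predicted label are assumptions rather than facts documented in this card.

```python
# Minimal inference sketch. Assumptions: the checkpoint is a sequence-classification
# rater over a text pair, and the rating is the argmax over class logits.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# The hub id may need a namespace prefix; this is the model name from the card title.
model_id = "deberta-v3-large-survey-related_passage_consistency-rater-half-gpt4"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Hypothetical example; the exact input format used during fine-tuning is not documented here.
question = "How satisfied are you with the onboarding process?"
passage = "The respondent found the onboarding materials clear and easy to follow."

inputs = tokenizer(question, passage, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted_rating = logits.argmax(dim=-1).item()
print(predicted_rating)
```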

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-06
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 20
  • mixed_precision_training: Native AMP
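Expressed as transformers TrainingArguments, the configuration above corresponds roughly to the sketch below; the output directory is a placeholder, and any option not listed above is left at its default.

```python
# Sketch of the reported hyperparameters as transformers TrainingArguments.
# Only the values listed above come from this card; output_dir is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="deberta-v3-large-consistency-rater",
    learning_rate=6e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=20,
    fp16=True,  # mixed_precision_training: Native AMP
)
```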

Training results

| Training Loss | Epoch | Step | Validation Loss | Krippendorff | Spearman | Absolute Agreement | Agreement Within One |
|:-------------:|:-----:|:----:|:---------------:|:------------:|:--------:|:------------------:|:--------------------:|
| No log        | 1.0   | 52   | 2.0495          | -0.3986      | nan      | 0.0833             | 0.9722               |
| No log        | 2.0   | 104  | 2.0425          | -0.3986      | nan      | 0.0833             | 0.9722               |
| No log        | 3.0   | 156  | 2.0507          | -0.3786      | -0.3801  | 0.1667             | 0.8889               |
| No log        | 4.0   | 208  | 2.6716          | -0.3464      | nan      | 0.25               | 0.8889               |
| No log        | 5.0   | 260  | 3.0110          | -0.3464      | nan      | 0.25               | 0.8889               |
| No log        | 6.0   | 312  | 2.9109          | -0.3464      | nan      | 0.25               | 0.8889               |
| No log        | 7.0   | 364  | 3.3403          | -0.3464      | nan      | 0.25               | 0.8889               |
| No log        | 8.0   | 416  | 2.6483          | -0.3464      | nan      | 0.25               | 0.8889               |
| No log        | 9.0   | 468  | 3.0075          | -0.3464      | nan      | 0.25               | 0.8889               |
| 0.8825        | 10.0  | 520  | 3.2051          | -0.3464      | nan      | 0.25               | 0.8889               |
| 0.8825        | 11.0  | 572  | 3.0661          | -0.3675      | -0.2582  | 0.2222             | 0.8889               |
| 0.8825        | 12.0  | 624  | 3.6486          | -0.3570      | -0.3583  | 0.2778             | 0.8889               |
| 0.8825        | 13.0  | 676  | 3.9220          | -0.3748      | -0.4156  | 0.2778             | 0.8889               |
| 0.8825        | 14.0  | 728  | 4.3003          | -0.3021      | -0.2901  | 0.3056             | 0.9167               |
| 0.8825        | 15.0  | 780  | 4.5146          | -0.3021      | -0.2901  | 0.3056             | 0.9167               |
| 0.8825        | 16.0  | 832  | 4.9068          | -0.2444      | -0.3129  | 0.25               | 0.9167               |
| 0.8825        | 17.0  | 884  | 4.8181          | -0.2444      | -0.3129  | 0.25               | 0.9167               |
| 0.8825        | 18.0  | 936  | 5.1541          | -0.2444      | -0.3129  | 0.25               | 0.9167               |
| 0.8825        | 19.0  | 988  | 4.4155          | -0.1643      | -0.2413  | 0.25               | 0.9167               |
| 0.2028        | 20.0  | 1040 | 5.5223          | -0.2687      | -0.3385  | 0.25               | 0.9167               |
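The card does not document how the metrics are computed; the sketch below shows one plausible implementation over integer ratings, using scipy and the krippendorff package (treating the ratings as ordinal is an assumption). The nan entries in the Spearman column are what scipy returns when one of its inputs is constant, which likely means the model predicted a single rating for every example at those checkpoints.

```python
# Plausible computation of the four reported metrics from integer ratings.
# Assumptions: predictions and labels are 1-D integer arrays on the same scale,
# and Krippendorff's alpha is computed at the ordinal level of measurement.
import numpy as np
import krippendorff
from scipy.stats import spearmanr

def rating_metrics(preds, labels):
    preds = np.asarray(preds, dtype=float)
    labels = np.asarray(labels, dtype=float)
    return {
        "krippendorff": krippendorff.alpha(
            reliability_data=[preds, labels], level_of_measurement="ordinal"
        ),
        "spearman": spearmanr(preds, labels).correlation,
        "absolute_agreement": float((preds == labels).mean()),
        "agreement_within_one": float((np.abs(preds - labels) <= 1).mean()),
    }
```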

Framework versions

  • Transformers 4.26.0
  • PyTorch 1.13.1
  • Datasets 2.10.1
  • Tokenizers 0.12.1