
deberta-v3-large-survey-main_passage_consistency-rater-gpt4

This model is a fine-tuned version of microsoft/deberta-v3-large on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3012
  • Krippendorff: 0.8426
  • Spearman: 0.8656
  • Absolute Agreement: 0.9183
  • Agreement Within One: 0.9673
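
The snippet below is a minimal inference sketch, not the authors' documented usage: it assumes the checkpoint is published under the card's title and that the rater uses a standard sequence-classification head over (passage, statement) pairs. Adjust the repo ID and post-processing to match the actual model.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "deberta-v3-large-survey-main_passage_consistency-rater-gpt4"  # assumed repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

passage = "The main passage text goes here."          # hypothetical input
statement = "A candidate statement to rate for consistency."  # hypothetical input

# Encode the pair; truncation guards against inputs longer than the model's limit.
inputs = tokenizer(passage, statement, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# With a classification head, the predicted rating is the argmax over labels.
print(logits.argmax(dim=-1).item())
```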

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-06
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 20
  • mixed_precision_training: Native AMP
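
For reference, the hyperparameters above map onto a transformers TrainingArguments configuration roughly as sketched below. This is an assumption about how training was set up, not the authors' actual script; the output_dir is hypothetical.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="deberta-v3-large-consistency-rater",  # hypothetical output path
    learning_rate=6e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,        # Adam betas/epsilon match the library defaults
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=20,
    fp16=True,             # "Native AMP" mixed-precision training
)
```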

Training results

| Training Loss | Epoch | Step | Validation Loss | Krippendorff | Spearman | Absolute Agreement | Agreement Within One |
|:-------------:|:-----:|:----:|:---------------:|:------------:|:--------:|:------------------:|:--------------------:|
| No log        | 1.0   | 50   | 2.0918          | -0.8602      | nan      | 0.0139             | 1.0                  |
| No log        | 2.0   | 100  | 2.0792          | -0.8602      | nan      | 0.0139             | 1.0                  |
| No log        | 3.0   | 150  | 1.8791          | -0.0618      | -0.1346  | 0.25               | 0.8472               |
| No log        | 4.0   | 200  | 2.0894          | -0.2303      | nan      | 0.375              | 0.8194               |
| No log        | 5.0   | 250  | 2.4916          | -0.2303      | nan      | 0.375              | 0.8194               |
| No log        | 6.0   | 300  | 2.6486          | -0.2303      | nan      | 0.375              | 0.8194               |
| No log        | 7.0   | 350  | 2.9221          | -0.2303      | nan      | 0.375              | 0.8194               |
| No log        | 8.0   | 400  | 2.9376          | -0.2303      | nan      | 0.375              | 0.8194               |
| No log        | 9.0   | 450  | 3.0675          | -0.2303      | nan      | 0.375              | 0.8194               |
| 1.0223        | 10.0  | 500  | 3.0166          | -0.2303      | nan      | 0.375              | 0.8194               |
| 1.0223        | 11.0  | 550  | 3.3407          | -0.2303      | nan      | 0.375              | 0.8194               |
| 1.0223        | 12.0  | 600  | 3.4897          | 0.1396       | 0.2611   | 0.375              | 0.8472               |
| 1.0223        | 13.0  | 650  | 3.8130          | 0.1396       | 0.2611   | 0.375              | 0.8472               |
| 1.0223        | 14.0  | 700  | 4.0549          | 0.1396       | 0.2611   | 0.375              | 0.8472               |
| 1.0223        | 15.0  | 750  | 4.3384          | -0.2303      | nan      | 0.375              | 0.8194               |
| 1.0223        | 16.0  | 800  | 4.5561          | -0.2303      | nan      | 0.375              | 0.8194               |
| 1.0223        | 17.0  | 850  | 4.4345          | -0.2303      | nan      | 0.375              | 0.8194               |
| 1.0223        | 18.0  | 900  | 4.8191          | -0.1764      | -0.0680  | 0.3611             | 0.8194               |
| 1.0223        | 19.0  | 950  | 4.8287          | -0.2303      | nan      | 0.375              | 0.8194               |
| 0.2678        | 20.0  | 1000 | 4.6614          | -0.1528      | -0.0976  | 0.3472             | 0.8194               |
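
The card does not specify how the evaluation metrics were implemented, so the following is a hedged sketch of one plausible computation: Krippendorff's alpha via the krippendorff package (assuming ordinal ratings), Spearman correlation via scipy, and the two agreement metrics as exact-match and within-one-point match rates. The rating arrays are hypothetical.

```python
import numpy as np
from scipy.stats import spearmanr
import krippendorff  # pip install krippendorff (assumed implementation)

predictions = np.array([2, 3, 3, 1, 4])  # hypothetical model ratings
references = np.array([2, 3, 4, 1, 4])   # hypothetical gold ratings

# Krippendorff's alpha treats the two arrays as two raters; the
# level_of_measurement ("ordinal") is an assumption about the rating scale.
alpha = krippendorff.alpha(
    reliability_data=[predictions, references],
    level_of_measurement="ordinal",
)
rho, _ = spearmanr(predictions, references)
absolute_agreement = float(np.mean(predictions == references))
agreement_within_one = float(np.mean(np.abs(predictions - references) <= 1))

print(alpha, rho, absolute_agreement, agreement_within_one)
```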

Framework versions

  • Transformers 4.26.0
  • Pytorch 1.13.1
  • Datasets 2.10.1
  • Tokenizers 0.12.1