deberta-v3-large-survey-topicality-rater-gpt4

This model is a fine-tuned version of microsoft/deberta-v3-large on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0013
  • Krippendorff: -0.0825
  • Spearman: 0.0965
  • Absolute Agreement: 0.7324
  • Agreement Within One: 0.7663
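
The card ships without a usage snippet, so here is a minimal inference sketch. The Hub repo id is inferred from the model name and the label layout is whatever the checkpoint's config declares; both are assumptions, not statements from the card.

```python
# Minimal inference sketch; the repo id below is assumed from the model name.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "deberta-v3-large-survey-topicality-rater-gpt4"  # assumed Hub repo id
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# Hypothetical survey question/answer pair to rate for topicality.
text = "Question: How satisfied are you with your commute? Answer: I bike to work."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # predicted topicality rating class
```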

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-06
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 20
  • mixed_precision_training: Native AMP
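
These settings map onto the standard transformers Trainer configuration roughly as sketched below. This is a reconstruction under stated assumptions (placeholder output_dir; fp16=True standing in for "Native AMP"), not the original training script.

```python
# Sketch of TrainingArguments mirroring the hyperparameters listed above.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="deberta-v3-large-survey-topicality-rater-gpt4",  # placeholder
    learning_rate=6e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=20,
    fp16=True,  # "Native AMP" mixed-precision training
)
```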

Training results

| Training Loss | Epoch | Step | Validation Loss | Krippendorff | Spearman | Absolute Agreement | Agreement Within One |
|---------------|-------|------|-----------------|--------------|----------|--------------------|----------------------|
| No log        | 1.0   | 50   | 2.0331          | -0.7817      | nan      | 0.0417             | 1.0                  |
| No log        | 2.0   | 100  | 2.0130          | -0.6164      | -0.0607  | 0.0694             | 0.9722               |
| No log        | 3.0   | 150  | 1.9945          | -0.5886      | -0.1318  | 0.0694             | 0.9583               |
| No log        | 4.0   | 200  | 1.9436          | 0.0270       | 0.1344   | 0.375              | 0.8194               |
| No log        | 5.0   | 250  | 2.0502          | -0.2309      | nan      | 0.3889             | 0.7917               |
| No log        | 6.0   | 300  | 1.9935          | -0.1983      | -0.0674  | 0.375              | 0.7917               |
| No log        | 7.0   | 350  | 2.1089          | -0.2309      | nan      | 0.3889             | 0.7917               |
| No log        | 8.0   | 400  | 2.0967          | -0.2309      | nan      | 0.3889             | 0.7917               |
| No log        | 9.0   | 450  | 2.0218          | -0.2309      | nan      | 0.3889             | 0.7917               |
| 1.183         | 10.0  | 500  | 2.4122          | -0.2309      | nan      | 0.3889             | 0.7917               |
| 1.183         | 11.0  | 550  | 2.1673          | -0.2358      | -0.1152  | 0.2361             | 0.8194               |
| 1.183         | 12.0  | 600  | 2.3777          | -0.1719      | -0.1368  | 0.2778             | 0.8056               |
| 1.183         | 13.0  | 650  | 2.7792          | -0.0861      | 0.1326   | 0.3889             | 0.8056               |
| 1.183         | 14.0  | 700  | 2.5054          | -0.1648      | -0.0297  | 0.2778             | 0.8194               |
| 1.183         | 15.0  | 750  | 3.0246          | -0.0957      | -0.0114  | 0.3611             | 0.7917               |
| 1.183         | 16.0  | 800  | 2.9566          | -0.0535      | 0.0724   | 0.375              | 0.8056               |
| 1.183         | 17.0  | 850  | 3.1392          | -0.0757      | 0.0155   | 0.3472             | 0.8056               |
| 1.183         | 18.0  | 900  | 3.0966          | -0.1148      | -0.0379  | 0.3194             | 0.8056               |
| 1.183         | 19.0  | 950  | 3.0304          | -0.1374      | -0.0884  | 0.3056             | 0.8056               |
| 0.6319        | 20.0  | 1000 | 2.7780          | 0.0069       | 0.1298   | 0.3611             | 0.8194               |
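
For reference, the four evaluation metrics can be reproduced from integer rating predictions along the lines of the sketch below. The `krippendorff` PyPI package and the "ordinal" measurement level are assumptions; the card does not say how its Krippendorff value was computed.

```python
# Sketch of the evaluation metrics, assuming integer ratings on both sides.
import numpy as np
import krippendorff  # assumed: the `krippendorff` PyPI package
from scipy.stats import spearmanr

def rating_metrics(preds, labels):
    preds = np.asarray(preds, dtype=float)
    labels = np.asarray(labels, dtype=float)
    # Krippendorff's alpha, treating model and gold labels as two raters.
    alpha = krippendorff.alpha(reliability_data=[preds, labels],
                               level_of_measurement="ordinal")
    rho, _ = spearmanr(preds, labels)  # Spearman rank correlation
    return {
        "krippendorff": alpha,
        "spearman": rho,
        "absolute_agreement": float(np.mean(preds == labels)),  # exact matches
        "agreement_within_one": float(np.mean(np.abs(preds - labels) <= 1)),
    }

print(rating_metrics([0, 1, 2, 2], [0, 2, 2, 3]))
```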

Framework versions

  • Transformers 4.26.0
  • PyTorch 1.13.1
  • Datasets 2.10.1
  • Tokenizers 0.12.1