
deberta-v3-large-survey-main_passage_consistency-rater-all

This model is a fine-tuned version of microsoft/deberta-v3-large on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2997
  • Krippendorff's alpha: 0.8410
  • Spearman: 0.8836
  • Absolute Agreement: 0.9159
  • Agreement Within One: 0.9689
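
The following is a minimal sketch of how the evaluation metrics above could be computed from model ratings against reference ratings. The variable names (`gold`, `pred`), the placeholder values, and the assumption that ratings are small integers on an ordinal scale are illustrative only; the `krippendorff` package and `scipy.stats.spearmanr` are one common way to obtain these scores, not necessarily what was used for this card.

```python
# Sketch only: placeholder ratings and hypothetical variable names.
import numpy as np
from scipy.stats import spearmanr
import krippendorff

gold = np.array([0, 1, 2, 2, 3, 1])   # reference ratings (placeholder values)
pred = np.array([0, 1, 2, 3, 3, 1])   # model ratings (placeholder values)

# Spearman rank correlation between predicted and reference ratings.
spearman, _ = spearmanr(pred, gold)

# Krippendorff's alpha, treating gold and model as two raters of ordinal data.
alpha = krippendorff.alpha(
    reliability_data=np.vstack([gold, pred]),
    level_of_measurement="ordinal",
)

# Exact-match agreement and agreement within one rating point.
absolute_agreement = float(np.mean(pred == gold))
agreement_within_one = float(np.mean(np.abs(pred - gold) <= 1))

print(alpha, spearman, absolute_agreement, agreement_within_one)
```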

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-06
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 20
  • mixed_precision_training: Native AMP
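
As a rough illustration, the hyperparameters listed above map onto Hugging Face `TrainingArguments` roughly as follows. The output directory is a placeholder, and the exact training script is not published with this card; the optimizer betas and epsilon listed above match the library's default AdamW settings.

```python
# Sketch of the listed hyperparameters as TrainingArguments; not the original script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="deberta-v3-large-rater",  # placeholder path
    learning_rate=6e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=20,
    fp16=True,  # "Native AMP" mixed precision
)
```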

Training results

| Training Loss | Epoch | Step | Validation Loss | Krippendorff's Alpha | Spearman | Absolute Agreement | Agreement Within One |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|:--------:|:------------------:|:--------------------:|
| No log        | 1.0   | 55   | 1.8615          | -0.1925              | nan      | 0.0833             | 0.9167               |
| No log        | 2.0   | 110  | 1.8468          | -0.1925              | nan      | 0.0833             | 0.9167               |
| No log        | 3.0   | 165  | 1.6437          | -0.2303              | nan      | 0.375              | 0.8194               |
| No log        | 4.0   | 220  | 1.7843          | -0.2303              | nan      | 0.375              | 0.8194               |
| No log        | 5.0   | 275  | 1.7113          | -0.2303              | nan      | 0.375              | 0.8194               |
| No log        | 6.0   | 330  | 1.7359          | -0.2303              | nan      | 0.375              | 0.8194               |
| No log        | 7.0   | 385  | 1.6476          | -0.2303              | nan      | 0.375              | 0.8194               |
| No log        | 8.0   | 440  | 1.7123          | -0.2303              | nan      | 0.375              | 0.8194               |
| No log        | 9.0   | 495  | 1.4061          | 0.0832               | 0.5104   | 0.5139             | 0.8611               |
| 1.2442        | 10.0  | 550  | 1.3195          | 0.0876               | 0.3809   | 0.5694             | 0.8889               |
| 1.2442        | 11.0  | 605  | 1.3376          | 0.0832               | 0.5104   | 0.5139             | 0.8611               |
| 1.2442        | 12.0  | 660  | 1.3155          | 0.1081               | 0.5170   | 0.5556             | 0.875                |
| 1.2442        | 13.0  | 715  | 1.2785          | 0.0948               | 0.4189   | 0.5833             | 0.875                |
| 1.2442        | 14.0  | 770  | 1.2800          | 0.2312               | 0.3578   | 0.5556             | 0.9167               |
| 1.2442        | 15.0  | 825  | 1.2081          | 0.2878               | 0.5758   | 0.5833             | 0.8889               |
| 1.2442        | 16.0  | 880  | 1.1245          | 0.2891               | 0.4589   | 0.5556             | 0.9028               |
| 1.2442        | 17.0  | 935  | 1.1434          | 0.2492               | 0.3892   | 0.5833             | 0.8889               |
| 1.2442        | 18.0  | 990  | 1.0987          | 0.4771               | 0.3821   | 0.5556             | 0.9444               |
| 0.6135        | 19.0  | 1045 | 1.0792          | 0.2770               | 0.3877   | 0.5694             | 0.8889               |
| 0.6135        | 20.0  | 1100 | 0.9862          | 0.0948               | 0.4189   | 0.5833             | 0.875                |

Framework versions

  • Transformers 4.26.0
  • Pytorch 1.13.1
  • Datasets 2.10.1
  • Tokenizers 0.12.1