# deberta-v3-large-survey-new_fact_main_passage-rater-all-gpt4
This model is a fine-tuned version of microsoft/deberta-v3-large; the fine-tuning dataset is not documented in this card. It achieves the following results on the evaluation set (a brief inference sketch follows the list):
- Loss: 0.7963
- Krippendorff: 0.7388
- Spearman: 0.7347
- Absolute Agreement: 0.7961
- Agreement Within One: 0.9401
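The card does not document intended usage, but the model name and the agreement metrics suggest a sequence-classification "rater" over ordinal rating classes. The sketch below is a minimal, assumed way to load the checkpoint for inference: the repo id (without its namespace prefix), the passage/fact input pairing, and reading the argmax class as a rating are all assumptions, not documented behaviour.

```python
# Minimal inference sketch (assumptions noted in comments).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "deberta-v3-large-survey-new_fact_main_passage-rater-all-gpt4"  # assumed repo id; prefix with the owning namespace
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Hypothetical input: a new fact paired with the main passage it is rated against.
inputs = tokenizer(
    "The new fact to rate.",
    "The main passage it is rated against.",
    return_tensors="pt",
    truncation=True,
)
with torch.no_grad():
    logits = model(**inputs).logits
predicted_rating = logits.argmax(dim=-1).item()  # assumed: class index maps to a rating level
print(predicted_rating)
```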
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 6e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 20
- mixed_precision_training: Native AMP
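As a rough guide only, the listed hyperparameters map onto a `transformers` `TrainingArguments` object as sketched below. Only the numeric values come from this card; the output directory and per-epoch evaluation strategy are assumptions, and the training/evaluation datasets are not documented here.

```python
# Hedged sketch: hyperparameters from the list above mapped to TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="deberta-v3-large-survey-new_fact_main_passage-rater-all-gpt4",  # assumed
    learning_rate=6e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=20,
    fp16=True,                    # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",  # assumed: the results table reports one evaluation per epoch
    logging_strategy="epoch",
)

# A full Trainer would also need the (undocumented) train/eval datasets and a
# compute_metrics function, e.g.:
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset,
#                   compute_metrics=compute_metrics)
```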
### Training results
| Training Loss | Epoch | Step | Validation Loss | Krippendorff | Spearman | Absolute Agreement | Agreement Within One |
|:-------------:|:-----:|:----:|:---------------:|:------------:|:--------:|:------------------:|:--------------------:|
| No log | 1.0 | 55 | 1.8840 | -0.1735 | 0.1672 | 0.3194 | 0.5694 |
| No log | 2.0 | 110 | 1.8714 | 0.3325 | 0.2924 | 0.3472 | 0.8333 |
| No log | 3.0 | 165 | 1.8455 | 0.3427 | 0.4616 | 0.375 | 0.9722 |
| No log | 4.0 | 220 | 1.8345 | -0.3980 | 0.1275 | 0.2361 | 1.0 |
| No log | 5.0 | 275 | 1.8686 | 0.5029 | 0.5400 | 0.4306 | 0.9306 |
| No log | 6.0 | 330 | 1.6423 | 0.8650 | 0.8167 | 0.5278 | 0.8889 |
| No log | 7.0 | 385 | 1.6656 | 0.8650 | 0.8167 | 0.5278 | 0.8889 |
| No log | 8.0 | 440 | 1.6436 | 0.8626 | 0.8162 | 0.5278 | 0.875 |
| No log | 9.0 | 495 | 1.5656 | 0.8626 | 0.8162 | 0.5278 | 0.875 |
| 1.2212 | 10.0 | 550 | 1.5328 | 0.7676 | 0.7513 | 0.5139 | 0.8611 |
| 1.2212 | 11.0 | 605 | 1.5906 | 0.8626 | 0.8162 | 0.5278 | 0.875 |
| 1.2212 | 12.0 | 660 | 1.3500 | 0.7795 | 0.7681 | 0.5278 | 0.8611 |
| 1.2212 | 13.0 | 715 | 1.1828 | 0.8324 | 0.7641 | 0.5972 | 0.875 |
| 1.2212 | 14.0 | 770 | 1.3495 | 0.7277 | 0.7280 | 0.5694 | 0.9028 |
| 1.2212 | 15.0 | 825 | 1.2211 | 0.8510 | 0.7811 | 0.6111 | 0.8889 |
| 1.2212 | 16.0 | 880 | 1.1597 | 0.8487 | 0.7714 | 0.625 | 0.875 |
| 1.2212 | 17.0 | 935 | 1.3074 | 0.8454 | 0.8281 | 0.625 | 0.875 |
| 1.2212 | 18.0 | 990 | 1.1712 | 0.8548 | 0.8057 | 0.625 | 0.875 |
| 0.4567 | 19.0 | 1045 | 1.0816 | 0.8484 | 0.8103 | 0.6389 | 0.875 |
| 0.4567 | 20.0 | 1100 | 1.0759 | 0.8537 | 0.8038 | 0.6111 | 0.8889 |
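The exact metric implementations behind these columns are not documented. The sketch below shows one plausible way to compute them from predicted and reference ratings, assuming the `krippendorff` and `scipy` packages, ordinal rating labels, and that the two agreement columns are exact-match and within-one-step match rates.

```python
# Hedged sketch of the evaluation metrics reported above.
import numpy as np
import krippendorff
from scipy.stats import spearmanr

def rating_metrics(preds, labels):
    preds = np.asarray(preds, dtype=float)
    labels = np.asarray(labels, dtype=float)
    # Krippendorff's alpha over two "raters": model predictions vs. references.
    alpha = krippendorff.alpha(
        reliability_data=np.vstack([preds, labels]),
        level_of_measurement="ordinal",  # assumption
    )
    rho, _ = spearmanr(preds, labels)
    return {
        "krippendorff": alpha,
        "spearman": rho,
        "absolute_agreement": float(np.mean(preds == labels)),
        "agreement_within_one": float(np.mean(np.abs(preds - labels) <= 1)),
    }

# Toy example with hypothetical ratings:
print(rating_metrics([1, 2, 3, 3], [1, 2, 2, 3]))
```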
### Framework versions
- Transformers 4.26.0
- Pytorch 1.13.1
- Datasets 2.10.1
- Tokenizers 0.12.1