# t5-paraphrase
This model is a fine-tuned version of [Vamsi/T5_Paraphrase_Paws](https://huggingface.co/Vamsi/T5_Paraphrase_Paws) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.0455
- Rouge2 Precision: 0.5933
- Rouge2 Recall: 0.36
- Rouge2 Fmeasure: 0.4202
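As a quick sanity check, here is a minimal inference sketch. It assumes the standard `transformers` seq2seq API and that this fine-tune keeps the `paraphrase: ... </s>` prompt format of the base model Vamsi/T5_Paraphrase_Paws; the path `./t5-paraphrase` is a placeholder for wherever the checkpoint is saved.

```python
# Minimal inference sketch; "./t5-paraphrase" is a placeholder path.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./t5-paraphrase")
model = AutoModelForSeq2SeqLM.from_pretrained("./t5-paraphrase")

# The base model expects a "paraphrase: ..." prefix; this fine-tune
# presumably keeps the same input format (assumption).
text = "paraphrase: The quick brown fox jumps over the lazy dog. </s>"
inputs = tokenizer(text, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_length=64,
    num_beams=5,
    num_return_sequences=1,
    early_stopping=True,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```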
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
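These values map directly onto the `transformers` Trainer configuration. Below is a hedged reconstruction using `Seq2SeqTrainingArguments`; only the listed hyperparameters come from this card, and the output directory and evaluation strategy are assumptions.

```python
# Sketch of the training configuration implied by the hyperparameters above.
# Only the listed values are from the card; everything else is a guess.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-paraphrase",       # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08, as listed above
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    evaluation_strategy="epoch",      # assumed from the per-epoch eval rows
)
```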
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge2 Precision | Rouge2 Recall | Rouge2 Fmeasure |
|:-------------:|:-----:|:----:|:---------------:|:----------------:|:-------------:|:---------------:|
| No log        | 1.0   | 34   | 0.0918          | 0.5469           | 0.324         | 0.3811          |
| No log        | 2.0   | 68   | 0.0700          | 0.5567           | 0.3245        | 0.3836          |
| No log        | 3.0   | 102  | 0.0616          | 0.5744           | 0.3407        | 0.3986          |
| No log        | 4.0   | 136  | 0.0564          | 0.5677           | 0.3334        | 0.3935          |
| No log        | 5.0   | 170  | 0.0518          | 0.5774           | 0.3389        | 0.4004          |
| No log        | 6.0   | 204  | 0.0499          | 0.583            | 0.3427        | 0.4042          |
| No log        | 7.0   | 238  | 0.0481          | 0.59             | 0.3457        | 0.4082          |
| No log        | 8.0   | 272  | 0.0466          | 0.5653           | 0.3316        | 0.3916          |
| No log        | 9.0   | 306  | 0.0451          | 0.5901           | 0.3511        | 0.4123          |
| No log        | 10.0  | 340  | 0.0449          | 0.6079           | 0.3572        | 0.423           |
| No log        | 11.0  | 374  | 0.0451          | 0.6139           | 0.3634        | 0.4273          |
| No log        | 12.0  | 408  | 0.0446          | 0.5903           | 0.3574        | 0.4134          |
| No log        | 13.0  | 442  | 0.0443          | 0.6077           | 0.3587        | 0.4218          |
| No log        | 14.0  | 476  | 0.0439          | 0.5921           | 0.3657        | 0.4235          |
| 0.1621        | 15.0  | 510  | 0.0442          | 0.6135           | 0.3758        | 0.4395          |
| 0.1621        | 16.0  | 544  | 0.0449          | 0.5744           | 0.3473        | 0.4054          |
| 0.1621        | 17.0  | 578  | 0.0447          | 0.5623           | 0.3324        | 0.3917          |
| 0.1621        | 18.0  | 612  | 0.0449          | 0.5877           | 0.3569        | 0.4165          |
| 0.1621        | 19.0  | 646  | 0.0452          | 0.5845           | 0.3542        | 0.4138          |
| 0.1621        | 20.0  | 680  | 0.0452          | 0.5909           | 0.3577        | 0.4189          |
| 0.1621        | 21.0  | 714  | 0.0452          | 0.5907           | 0.3567        | 0.4179          |
| 0.1621        | 22.0  | 748  | 0.0453          | 0.5909           | 0.3577        | 0.4189          |
| 0.1621        | 23.0  | 782  | 0.0453          | 0.6002           | 0.367         | 0.427           |
| 0.1621        | 24.0  | 816  | 0.0453          | 0.5958           | 0.3642        | 0.4242          |
| 0.1621        | 25.0  | 850  | 0.0452          | 0.5915           | 0.3582        | 0.4182          |
| 0.1621        | 26.0  | 884  | 0.0451          | 0.5933           | 0.36          | 0.4202          |
| 0.1621        | 27.0  | 918  | 0.0454          | 0.5985           | 0.3625        | 0.4238          |
| 0.1621        | 28.0  | 952  | 0.0453          | 0.5941           | 0.3608        | 0.4211          |
| 0.1621        | 29.0  | 986  | 0.0454          | 0.5933           | 0.36          | 0.4202          |
| 0.011         | 30.0  | 1020 | 0.0455          | 0.5933           | 0.36          | 0.4202          |
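The per-metric precision/recall/F-measure breakdown matches the output of the `rouge_score` package (which the Hugging Face ROUGE metric wraps). A minimal sketch of how such numbers are computed, assuming that package was used; the example strings are illustrative only, since the evaluation set is not documented here:

```python
# Sketch of computing Rouge2 precision/recall/fmeasure with rouge_score.
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge2"], use_stemmer=True)

# Illustrative strings; the actual evaluation data is not documented here.
reference = "the cat sat on the mat"
prediction = "a cat sat on the mat"

scores = scorer.score(reference, prediction)["rouge2"]
print(scores.precision, scores.recall, scores.fmeasure)
```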
### Framework versions
- Transformers 4.25.1
- Pytorch 1.13.0+cu116
- Datasets 2.7.1
- Tokenizers 0.13.2