# t5-text-simplification_1e4_adafactor
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small), trained for text simplification on an unknown dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the metrics):
- Loss: 0.4541
- Rouge1: 63.482
- Rouge2: 46.0572
- Rougel: 58.7168
- Rougelsum: 58.691
- Gen Len: 18.1518
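Since the descriptive sections below are still placeholders, here is a minimal inference sketch. The repo id is a hypothetical placeholder (the namespace is not given in this card), and the model may or may not expect a task prefix, so adjust the input formatting to match how it was actually trained.

```python
# Minimal inference sketch. The repo id below is a hypothetical placeholder,
# and the input formatting (e.g. a task prefix) may differ from training.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "<namespace>/t5-text-simplification_1e4_adafactor"  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "The committee reached a consensus after protracted deliberations."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
output_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```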
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 256
- eval_batch_size: 256
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
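A sketch of how these hyperparameters could map onto `Seq2SeqTrainingArguments` is shown below. This is a reconstruction for illustration, not the original training script: the output directory and the per-epoch evaluation strategy are assumptions, and the reported Adam settings (betas=(0.9, 0.999), epsilon=1e-08) match the Trainer defaults, so they are not set explicitly.

```python
# Reconstruction of the listed hyperparameters as Seq2SeqTrainingArguments.
# output_dir and evaluation_strategy are assumptions; the reported Adam settings
# are the Trainer defaults and therefore not set here.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-text-simplification_1e4_adafactor",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="epoch",   # assumption: matches the per-epoch results table below
    predict_with_generate=True,    # generate text at eval time so ROUGE can be computed
)
```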
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|---|---|---|---|---|---|---|---|---|
| 0.5806 | 1.0 | 582 | 0.4611 | 63.4316 | 45.9106 | 58.8265 | 58.7225 | 18.2723 |
| 0.5615 | 2.0 | 1164 | 0.4597 | 63.4805 | 46.0473 | 58.8515 | 58.7755 | 18.2304 |
| 0.5478 | 3.0 | 1746 | 0.4569 | 63.582 | 46.4341 | 58.9334 | 58.9024 | 18.2251 |
| 0.5418 | 4.0 | 2328 | 0.4563 | 63.3978 | 46.3386 | 58.7994 | 58.7682 | 18.1937 |
| 0.5358 | 5.0 | 2910 | 0.4557 | 63.3056 | 46.0741 | 58.9362 | 58.9063 | 18.1675 |
| 0.5304 | 6.0 | 3492 | 0.4555 | 63.2044 | 45.9513 | 58.6796 | 58.6155 | 18.1414 |
| 0.5219 | 7.0 | 4074 | 0.4546 | 63.3451 | 46.0504 | 58.7905 | 58.7105 | 18.1937 |
| 0.5188 | 8.0 | 4656 | 0.4552 | 63.1977 | 46.04 | 58.6804 | 58.6152 | 18.178 |
| 0.5152 | 9.0 | 5238 | 0.4546 | 63.2055 | 45.926 | 58.5223 | 58.5038 | 18.1885 |
| 0.5113 | 10.0 | 5820 | 0.4537 | 63.4876 | 46.3268 | 58.8035 | 58.7711 | 18.1937 |
| 0.5135 | 11.0 | 6402 | 0.4548 | 63.0435 | 45.5796 | 58.4433 | 58.3963 | 18.1414 |
| 0.5074 | 12.0 | 6984 | 0.4545 | 63.2951 | 45.7406 | 58.5712 | 58.4938 | 18.1518 |
| 0.5061 | 13.0 | 7566 | 0.4543 | 63.3588 | 46.169 | 58.661 | 58.6234 | 18.1832 |
| 0.5026 | 14.0 | 8148 | 0.4543 | 63.3304 | 46.0553 | 58.5952 | 58.5745 | 18.1832 |
| 0.5021 | 15.0 | 8730 | 0.4542 | 63.3378 | 45.8684 | 58.605 | 58.5854 | 18.1518 |
| 0.5016 | 16.0 | 9312 | 0.4537 | 63.478 | 46.0719 | 58.7172 | 58.6834 | 18.1885 |
| 0.4995 | 17.0 | 9894 | 0.4538 | 63.5111 | 46.0395 | 58.7451 | 58.7191 | 18.1571 |
| 0.5009 | 18.0 | 10476 | 0.4539 | 63.5128 | 46.0939 | 58.7491 | 58.7169 | 18.1571 |
| 0.4982 | 19.0 | 11058 | 0.4541 | 63.4593 | 46.0379 | 58.7067 | 58.6801 | 18.1518 |
| 0.4977 | 20.0 | 11640 | 0.4541 | 63.482 | 46.0572 | 58.7168 | 58.691 | 18.1518 |
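For context, ROUGE scores in this style of card are commonly produced by the `evaluate` library inside a `compute_metrics` function and scaled to the 0-100 range before logging. The sketch below is a generic version of such a function, not the exact one used for this run.

```python
# Generic sketch of a ROUGE compute_metrics factory (not the exact one used here).
# Pass it to Seq2SeqTrainer via compute_metrics=make_compute_metrics(tokenizer).
import evaluate
import numpy as np

rouge = evaluate.load("rouge")

def make_compute_metrics(tokenizer):
    def compute_metrics(eval_preds):
        preds, labels = eval_preds
        # Labels use -100 for padding; swap it back before decoding.
        labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
        decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
        decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
        result = rouge.compute(
            predictions=decoded_preds, references=decoded_labels, use_stemmer=True
        )
        # Scale 0-1 scores to the 0-100 range shown in the table above.
        result = {k: round(v * 100, 4) for k, v in result.items()}
        pred_lens = [np.count_nonzero(p != tokenizer.pad_token_id) for p in preds]
        result["gen_len"] = float(np.mean(pred_lens))
        return result
    return compute_metrics
```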
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3