# t5-small-finetuned-tifu
This model is a fine-tuned version of hientptran/t5-small-finetuned-tifu on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 2.9160
- Rouge1: 21.4234
- Rouge2: 5.1697
- Rougel: 17.8628
- Rougelsum: 18.3012
- Gen Len: 19.4414
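
For inference, the checkpoint should load with the standard `transformers` seq2seq classes. The sketch below is illustrative rather than taken from this card: the `summarize:` prefix follows the usual T5 convention (not confirmed here), the input text is a placeholder, and `max_length=20` simply mirrors the ~19-token Gen Len reported above.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "hientptran/t5-small-finetuned-tifu"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# "summarize:" prefix is an assumption based on common T5 usage.
text = "summarize: Today I tried to fix my bike and ended up breaking the chain instead."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

# max_length=20 roughly matches the reported Gen Len of ~19 tokens.
summary_ids = model.generate(**inputs, max_length=20, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```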
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the configuration sketch after the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
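
As a rough reconstruction only: assuming the run used the Hugging Face `Seq2SeqTrainer`, the hyperparameters above would map to something like the following `Seq2SeqTrainingArguments`. The `output_dir` and the choice of trainer are assumptions, not stated in this card.

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical reconstruction of the listed hyperparameters; output_dir is an assumption.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-finetuned-tifu",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",        # AdamW with betas=(0.9, 0.999), epsilon=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=3,
    fp16=True,                  # "Native AMP" mixed-precision training
)
```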
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2 | Rougel  | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:-------:|:---------:|:-------:|
| 3.0712        | 1.0   | 2107 | 2.9372          | 20.9834 | 4.9652 | 17.5404 | 17.942    | 19.448  |
| 3.0523        | 2.0   | 4214 | 2.9216          | 21.3716 | 5.1541 | 17.7939 | 18.2228   | 19.4411 |
| 3.0546        | 3.0   | 6321 | 2.9160          | 21.4234 | 5.1697 | 17.8628 | 18.3012   | 19.4414 |
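
The ROUGE columns follow the usual model-card convention of scores scaled to 0–100. Below is a minimal sketch of how such numbers are typically computed with the `evaluate` library; the example strings are placeholders, and whether this exact setup was used for the table above is an assumption.

```python
import evaluate

rouge = evaluate.load("rouge")

# Placeholder predictions/references; a real evaluation would use the validation split.
predictions = ["broke my bike chain while trying to fix it"]
references = ["tried to fix my bike and broke the chain instead"]

scores = rouge.compute(predictions=predictions, references=references)
# evaluate returns fractions in [0, 1]; model cards usually report them * 100.
print({k: round(v * 100, 4) for k, v in scores.items()})
```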
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0