# mt5-small-finetuned-amazon-en-es
This model is a fine-tuned version of [google/mt5-small](https://huggingface.co/google/mt5-small) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 3.0183
- Rouge1: 16.8462
- Rouge2: 7.9926
- Rougel: 16.7138
- Rougelsum: 16.7353
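Once the checkpoint is pushed to the Hub, it can be loaded with the standard `transformers` summarization pipeline. A minimal sketch, assuming a hypothetical repo id (replace `your-username/mt5-small-finetuned-amazon-en-es` with the actual path of this checkpoint):

```python
from transformers import pipeline

# Hypothetical Hub repo id -- substitute the actual path of this checkpoint.
hub_model_id = "your-username/mt5-small-finetuned-amazon-en-es"

summarizer = pipeline("summarization", model=hub_model_id)

review = "I loved this notebook, but the cover arrived scratched and the pages are thinner than expected."

# The model was fine-tuned to produce short, title-like summaries,
# so a small max_length is appropriate.
print(summarizer(review, max_length=30, min_length=2)[0]["summary_text"])
```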
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5.6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 8
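These settings map directly onto `Seq2SeqTrainingArguments`. A sketch of the corresponding configuration, under the assumption that evaluation ran once per epoch (consistent with the results table below); `output_dir` is a placeholder:

```python
from transformers import Seq2SeqTrainingArguments

# Mirrors the hyperparameters listed above; Adam with the stated betas/epsilon
# and a linear LR schedule are the Trainer defaults.
args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-finetuned-amazon-en-es",  # placeholder path
    learning_rate=5.6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=8,
    evaluation_strategy="epoch",  # assumption: per-epoch eval, matching the table below
    predict_with_generate=True,   # generate summaries at eval time so ROUGE can be computed
)
```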
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|---|---|---|---|---|---|---|---|
| 3.6757 | 1.0 | 1209 | 3.2206 | 17.7063 | 9.0094 | 17.1467 | 17.1168 |
| 3.6404 | 2.0 | 2418 | 3.0712 | 16.3983 | 7.5945 | 16.0944 | 15.9995 |
| 3.4255 | 3.0 | 3627 | 3.0459 | 17.8576 | 9.1951 | 17.4539 | 17.4929 |
| 3.3127 | 4.0 | 4836 | 3.0397 | 16.9239 | 7.8104 | 16.6155 | 16.5850 |
| 3.2325 | 5.0 | 6045 | 3.0406 | 16.8228 | 8.1594 | 16.6843 | 16.6554 |
| 3.1674 | 6.0 | 7254 | 3.0252 | 17.1619 | 8.4048 | 17.0407 | 17.0515 |
| 3.1355 | 7.0 | 8463 | 3.0226 | 17.1640 | 8.2978 | 17.0445 | 17.1064 |
| 3.1118 | 8.0 | 9672 | 3.0183 | 16.8462 | 7.9926 | 16.7138 | 16.7353 |
### Framework versions
- Transformers 4.25.1
- Pytorch 1.13.0+cu116
- Datasets 2.8.0
- Tokenizers 0.13.2