
mt5-small-finetuned-amazon-en-es

This model is a fine-tuned version of google/mt5-small. The training dataset is not recorded in the card metadata; the model name suggests Amazon product reviews in English and Spanish. It achieves the following results on the evaluation set:

  • Loss: 3.0255
  • ROUGE-1: 17.5202
  • ROUGE-2: 8.4634
  • ROUGE-L: 17.0175
  • ROUGE-Lsum: 17.0528
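
The checkpoint can be loaded with the transformers summarization pipeline. The snippet below is a minimal sketch: the repo id is shown with a placeholder namespace, and the example review text and generation lengths are illustrative, not taken from this card.

```python
from transformers import pipeline

# Hypothetical repo id: replace <username> with the namespace this card is published under.
summarizer = pipeline(
    "summarization",
    model="<username>/mt5-small-finetuned-amazon-en-es",
)

review = (
    "Nothing special at all about this product; the book is too small and "
    "stiff and hard to write in."
)

# max_length / min_length are illustrative values, not from the card.
print(summarizer(review, max_length=30, min_length=5)[0]["summary_text"])
```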

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5.6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 8
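
As a reference, the hyperparameters above map roughly onto the Seq2SeqTrainingArguments sketch below. This is a hedged reconstruction, not the original training script; output_dir, evaluation strategy, and predict_with_generate are assumptions, and argument names follow the Transformers 4.16-era API.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the hyperparameters listed above (Transformers 4.16-era API).
# output_dir and evaluation settings are assumptions, not from the card.
training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-finetuned-amazon-en-es",
    learning_rate=5.6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=8,
    lr_scheduler_type="linear",   # Adam defaults: betas=(0.9, 0.999), eps=1e-8
    evaluation_strategy="epoch",  # assumption: matches the per-epoch results table
    predict_with_generate=True,   # assumption: needed to compute ROUGE during eval
)
```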

Training results

| Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum |
|--------------:|------:|-----:|----------------:|--------:|--------:|--------:|-----------:|
| 8.0940        | 1.0   | 1209 | 3.2933          | 12.7563 | 5.2606  | 12.4786 | 12.4961    |
| 3.9263        | 2.0   | 2418 | 3.1487          | 16.2314 | 8.4716  | 15.6854 | 15.7506    |
| 3.5990        | 3.0   | 3627 | 3.0789          | 16.9233 | 8.1928  | 16.2596 | 16.2522    |
| 3.4290        | 4.0   | 4836 | 3.0492          | 17.2679 | 8.7561  | 16.6685 | 16.7399    |
| 3.3279        | 5.0   | 6045 | 3.0384          | 17.6081 | 8.6721  | 17.0546 | 17.0368    |
| 3.2518        | 6.0   | 7254 | 3.0343          | 17.2271 | 8.5040  | 16.6285 | 16.6209    |
| 3.2084        | 7.0   | 8463 | 3.0255          | 16.7859 | 8.0540  | 16.2574 | 16.2853    |
| 3.1839        | 8.0   | 9672 | 3.0255          | 17.5202 | 8.4634  | 17.0175 | 17.0528    |
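
Scores of this kind can be computed with the rouge metric from the datasets library (version 1.18 is listed under framework versions below, and the rouge_score package must be installed). The snippet is a sketch with hypothetical prediction and reference strings, not the evaluation code used for this run.

```python
from datasets import load_metric

rouge = load_metric("rouge")  # requires the rouge_score package

# Hypothetical model outputs and reference summaries.
predictions = ["the book is too small to write in"]
references = ["nothing special, the notebook is small and stiff"]

scores = rouge.compute(predictions=predictions, references=references)
# Mid-quantile F1, scaled to 0-100 as in the table above.
print({k: round(v.mid.fmeasure * 100, 4) for k, v in scores.items()})
```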

Framework versions

  • Transformers 4.16.2
  • PyTorch 1.10.0+cu111
  • Datasets 1.18.3
  • Tokenizers 0.11.6