mt5-small-finetuned-amazon-en-es

This model is a fine-tuned version of google/mt5-small (the training dataset is not specified in this card). It achieves the following results on the evaluation set:

  • Loss: 3.0296
  • Rouge1: 18.0335
  • Rouge2: 8.816
  • Rougel: 17.5279
  • Rougelsum: 17.6189

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5.6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 8
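
The hyperparameters above map directly onto Transformers training arguments. The following is a minimal configuration sketch, not the exact training script used for this model; the output_dir name is a placeholder.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of training arguments mirroring the hyperparameters listed above.
# output_dir is a placeholder; the original training script is not provided.
training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-finetuned-amazon-en-es",
    learning_rate=5.6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,            # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=8,
    predict_with_generate=True,  # needed to compute ROUGE during evaluation
)
```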

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2 | Rougel  | Rougelsum |
|--------------:|------:|-----:|----------------:|--------:|-------:|--------:|----------:|
| 6.9312        | 1.0   | 1209 | 3.2984          | 14.4268 | 6.4451 | 14.0547 | 14.1363   |
| 3.8882        | 2.0   | 2418 | 3.1272          | 17.1618 | 8.7776 | 16.4569 | 16.5079   |
| 3.578         | 3.0   | 3627 | 3.0798          | 17.9251 | 9.2806 | 17.4056 | 17.3871   |
| 3.4191        | 4.0   | 4836 | 3.0671          | 17.6256 | 8.8731 | 16.975  | 17.0113   |
| 3.3193        | 5.0   | 6045 | 3.0605          | 17.9539 | 8.7188 | 17.4034 | 17.4726   |
| 3.2434        | 6.0   | 7254 | 3.0387          | 17.0668 | 8.2769 | 16.5612 | 16.6636   |
| 3.208         | 7.0   | 8463 | 3.0338          | 17.2954 | 8.4547 | 16.7602 | 16.8175   |
| 3.1812        | 8.0   | 9672 | 3.0296          | 18.0335 | 8.816  | 17.5279 | 17.6189   |
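
The ROUGE columns above are n-gram overlap scores between generated and reference summaries (in practice usually computed with a library such as rouge_score). As a rough illustration of what Rouge1 measures, here is a minimal sketch of unigram-overlap F1 on whitespace tokens; real implementations also apply stemming and tokenization rules this sketch omits.

```python
from collections import Counter

def rouge1_f(reference: str, candidate: str) -> float:
    """Illustrative ROUGE-1 F1: unigram overlap between two texts."""
    ref = Counter(reference.split())
    cand = Counter(candidate.split())
    overlap = sum((ref & cand).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# Example: a 2-token candidate covering 2 of 6 reference tokens
score = rouge1_f("the cat sat on the mat", "the cat")  # precision 1.0, recall 1/3
```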

Framework versions

  • Transformers 4.25.1
  • Pytorch 1.13.0+cu116
  • Datasets 2.7.1
  • Tokenizers 0.13.2