
mt5-small-finetuned-amazon-en-es

This model is a fine-tuned version of google/mt5-small on the cnn_dailymail dataset. It achieves the following results on the evaluation set:

  • Loss: 2.4413
  • ROUGE-1: 22.6804
  • ROUGE-2: 8.3299
  • ROUGE-L: 17.9992
  • ROUGE-Lsum: 20.7342

Model description

This model is google/mt5-small, a multilingual T5 checkpoint, fine-tuned for abstractive summarization on the cnn_dailymail dataset. Training details are listed under Training procedure below.

Intended uses & limitations

The model is intended for abstractive summarization of English news-style articles, in line with the cnn_dailymail training data; no further details on limitations are documented. A minimal inference sketch is shown below.
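The sketch below uses the transformers summarization pipeline. It assumes the published checkpoint id shivaniNK8/mt5-small-finetuned-amazon-en-es and that sentencepiece is installed; the generation lengths are illustrative values, not settings from this card.

```python
# Minimal inference sketch; the checkpoint id and generation lengths are assumptions.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="shivaniNK8/mt5-small-finetuned-amazon-en-es",
)

article = "Paste the news article text to summarize here."

# max_length / min_length are illustrative; tune them for your inputs.
summary = summarizer(article, max_length=64, min_length=8, truncation=True)
print(summary[0]["summary_text"])
```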

Training and evaluation data

The model was fine-tuned and evaluated on the cnn_dailymail dataset; the exact splits and any subsampling used are not documented here.

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of how they map onto Seq2SeqTrainingArguments follows the list):

  • learning_rate: 5.6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 8
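As a rough guide, the settings above correspond to the standard Seq2SeqTrainingArguments shown below. The output_dir and the per-epoch evaluation strategy are assumptions (the results table suggests one evaluation per epoch), and the Adam betas/epsilon listed above are the Trainer defaults, so they need no explicit arguments.

```python
# Hedged sketch of the hyperparameters above as Seq2SeqTrainingArguments
# (transformers 4.20.x). output_dir and evaluation_strategy are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-finetuned-amazon-en-es",  # placeholder output path
    learning_rate=5.6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=8,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",  # assumed: the results table reports one eval per epoch
    predict_with_generate=True,   # generate summaries during eval so ROUGE can be computed
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults.
)
```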

Training results

| Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum |
|--------------:|------:|-----:|----------------:|--------:|--------:|--------:|-----------:|
| 7.77          | 1.0   | 240  | 2.7230          | 17.25   | 5.629   | 14.0381 | 15.8959    |
| 3.7586        | 2.0   | 480  | 2.5949          | 19.4577 | 6.9354  | 15.772  | 17.8773    |
| 3.4314        | 3.0   | 720  | 2.5355          | 20.0511 | 7.6417  | 16.0889 | 18.4551    |
| 3.2892        | 4.0   | 960  | 2.4845          | 20.3951 | 7.88    | 16.601  | 19.0048    |
| 3.1954        | 5.0   | 1200 | 2.4612          | 20.1806 | 7.2656  | 16.2658 | 18.6222    |
| 3.1128        | 6.0   | 1440 | 2.4544          | 22.5647 | 8.0899  | 17.8057 | 20.487     |
| 3.103         | 7.0   | 1680 | 2.4498          | 22.7048 | 8.384   | 17.978  | 20.6871    |
| 3.0708        | 8.0   | 1920 | 2.4413          | 22.6804 | 8.3299  | 17.9992 | 20.7342    |
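The ROUGE columns appear to be F-measures on the 0-100 scale conventionally reported in these cards. The sketch below shows how such scores can be computed with the rouge_score package (the backend behind the Hugging Face rouge metric); the example strings are placeholders, not dataset content.

```python
# Sketch: computing ROUGE F-measures on the 0-100 scale used in the table above.
# The reference/prediction strings are placeholders, not from cnn_dailymail.
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(
    ["rouge1", "rouge2", "rougeL", "rougeLsum"], use_stemmer=True
)

reference = "the cat sat on the mat"    # placeholder gold summary
prediction = "the cat lay on the mat"   # placeholder generated summary

scores = scorer.score(reference, prediction)
print({name: round(s.fmeasure * 100, 4) for name, s in scores.items()})
```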

Framework versions

  • Transformers 4.20.1
  • PyTorch 1.12.0+cu113
  • Datasets 2.3.2
  • Tokenizers 0.12.1