mt5-small-finetuned-amazon-en-zh_TW

This model is a fine-tuned version of google/mt5-small on an unspecified dataset (the model name suggests Amazon review summarization in English and Traditional Chinese). It achieves the following results on the evaluation set:

  • Loss: 3.2408
  • ROUGE-1: 15.8831
  • ROUGE-2: 7.1676
  • ROUGE-L: 15.5523
  • ROUGE-Lsum: 15.4954

Model description

More information needed

Intended uses & limitations

More information needed
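No intended-use details were provided in the original card. As a minimal sketch of how a summarization checkpoint like this one is typically loaded with the transformers pipeline (the repository id below is inferred from the card title and may need the owner's namespace prefix; the example review is illustrative):

```python
from transformers import pipeline

# Repository id inferred from the card title; prepend the owner's namespace
# (e.g. "username/mt5-small-finetuned-amazon-en-zh_TW") as needed.
checkpoint = "mt5-small-finetuned-amazon-en-zh_TW"

summarizer = pipeline("summarization", model=checkpoint)

# Illustrative product review; the model name suggests it was tuned to
# summarize Amazon reviews in English and Traditional Chinese.
review = (
    "I bought this keyboard a month ago. The keys feel great and the battery "
    "lasts for weeks, but the Bluetooth connection occasionally drops."
)
print(summarizer(review, max_length=30, min_length=5)[0]["summary_text"])
```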

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

  • learning_rate: 5.6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 7
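
A minimal sketch of how these values map onto Seq2SeqTrainingArguments. The output directory is a placeholder, per-epoch evaluation with generation is an assumption based on the results table below, and the Adam betas/epsilon and linear scheduler shown explicitly are also the Trainer defaults:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-finetuned-amazon-en-zh_TW",  # placeholder name
    learning_rate=5.6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=7,
    evaluation_strategy="epoch",   # assumed from the per-epoch results below
    predict_with_generate=True,    # needed to compute ROUGE during evaluation
)
```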

Training results

| Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:----------:|
| 7.5388        | 1.0   | 838  | 3.5888          | 12.6081 | 5.3611  | 12.3495 | 12.2926    |
| 4.0043        | 2.0   | 1676 | 3.4038          | 13.8517 | 6.3417  | 13.4755 | 13.4913    |
| 3.6776        | 3.0   | 2514 | 3.3294          | 15.1519 | 7.3842  | 14.8844 | 14.8458    |
| 3.4929        | 4.0   | 3352 | 3.2668          | 15.6067 | 7.4016  | 15.3715 | 15.2908    |
| 3.387         | 5.0   | 4190 | 3.2855          | 15.0546 | 7.3065  | 14.8271 | 14.7755    |
| 3.302         | 6.0   | 5028 | 3.2457          | 15.0213 | 6.6597  | 14.6131 | 14.5641    |
| 3.2806        | 7.0   | 5866 | 3.2408          | 15.8831 | 7.1676  | 15.5523 | 15.4954    |
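
The ROUGE values above are on a 0-100 scale. A minimal sketch of how such scores are commonly computed during evaluation with the ROUGE metric from datasets; the compute_metrics function and the tokenizer checkpoint are assumptions, not taken from the original training script, and the rouge_score package must be installed:

```python
import numpy as np
from datasets import load_metric
from transformers import AutoTokenizer

rouge = load_metric("rouge")  # requires the rouge_score package

# Tokenizer checkpoint is an assumption (the base model of this fine-tune).
tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")

def compute_metrics(eval_pred):
    """Decode generated and reference token ids, then score with ROUGE."""
    predictions, labels = eval_pred
    decoded_preds = tokenizer.batch_decode(predictions, skip_special_tokens=True)
    # Labels are padded with -100 by the data collator; restore real pad ids
    # before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    scores = rouge.compute(
        predictions=decoded_preds, references=decoded_labels, use_stemmer=True
    )
    # Report the mid F-measure of each ROUGE variant on a 0-100 scale,
    # matching the columns in the table above.
    return {name: value.mid.fmeasure * 100 for name, value in scores.items()}
```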

Framework versions

  • Transformers 4.17.0
  • Pytorch 1.10.0+cu111
  • Datasets 1.18.4
  • Tokenizers 0.11.6