
mt5-small-finetuned-amazon-en-de

This model is a fine-tuned version of google/mt5-small on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 2.6824
  • Rouge1: 16.5186
  • Rouge2: 10.0545
  • RougeL: 16.2944
  • RougeLsum: 16.2835

Model description

More information needed

Intended uses & limitations

More information needed
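
As a rough illustration of how the checkpoint could be used for review summarization, a minimal sketch follows. The Hub repository id and the input review are placeholders, not part of this card.

```python
# Minimal usage sketch; the repository id below is a hypothetical placeholder,
# not the actual published path of this checkpoint.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="your-username/mt5-small-finetuned-amazon-en-de",  # assumed repo id
)

# Example input review (illustrative only)
review = (
    "The notebook is much smaller than expected and the cover feels flimsy, "
    "so it is hard to write in."
)
print(summarizer(review, max_length=30)[0]["summary_text"])
```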

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5.6e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 8
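
A sketch of `Seq2SeqTrainingArguments` reproducing the hyperparameters listed above is given below. The output directory and evaluation strategy are assumptions; the Adam betas and epsilon listed above match the Transformers defaults, so they are not set explicitly.

```python
# Sketch of training arguments matching the listed hyperparameters.
# output_dir and evaluation_strategy are assumptions, not stated in the card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-finetuned-amazon-en-de",  # assumed output path
    learning_rate=5.6e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=8,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",   # assumption: evaluate once per epoch
    predict_with_generate=True,    # generate summaries so ROUGE can be computed
)
```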

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum |
|---------------|-------|------|-----------------|---------|---------|---------|-----------|
| 8.4221        | 1.0   | 651  | 3.1302          | 13.8778 | 6.1808  | 13.6862 | 13.6699   |
| 4.1085        | 2.0   | 1302 | 2.8969          | 13.7773 | 7.1463  | 13.7471 | 13.7448   |
| 3.7329        | 3.0   | 1953 | 2.8285          | 13.3819 | 6.5587  | 13.3349 | 13.1454   |
| 3.5489        | 4.0   | 2604 | 2.7547          | 16.886  | 9.8816  | 16.8247 | 16.8231   |
| 3.4223        | 5.0   | 3255 | 2.7334          | 16.6755 | 10.0955 | 16.5465 | 16.5025   |
| 3.3509        | 6.0   | 3906 | 2.6994          | 16.851  | 10.5061 | 16.6289 | 16.7191   |
| 3.2895        | 7.0   | 4557 | 2.6871          | 16.4401 | 10.0994 | 16.2156 | 16.224    |
| 3.281         | 8.0   | 5208 | 2.6824          | 16.5186 | 10.0545 | 16.2944 | 16.2835   |
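
The ROUGE scores in the table are reported on a 0-100 scale. As a hedged sketch of how such scores are typically computed with the `evaluate` library (the prediction and reference strings below are placeholders):

```python
# Sketch of ROUGE computation with the evaluate library; the strings are
# illustrative placeholders, not outputs of this model.
import evaluate

rouge = evaluate.load("rouge")
predictions = ["great little notebook for travel"]      # example model output
references = ["a great small notebook for travelling"]  # example gold summary

scores = rouge.compute(predictions=predictions, references=references)
# Returns keys rouge1, rouge2, rougeL, rougeLsum on a 0-1 scale;
# multiply by 100 to match the values in the table above.
print(scores)
```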

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0