
gendered

This model is a fine-tuned version of samzirbo/mT5.en-es.pretrained on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1747
  • BLEU: 43.5153
  • METEOR: 0.6866
  • ChrF++: 62.4637

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0005
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 1000
  • training_steps: 50000
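
The schedule above (linear warmup over 1,000 steps, then cosine decay across the 50,000 training steps) can be sketched in plain Python. This is a minimal sketch assuming the standard behaviour of Transformers' `get_cosine_schedule_with_warmup` (decay from the peak rate to zero); the function name is illustrative, not part of the training code:

```python
import math

def cosine_lr(step, peak_lr=5e-4, warmup_steps=1000, total_steps=50000):
    """Learning rate at a given step: linear warmup to peak_lr,
    then cosine decay to 0 over the remaining steps (mirrors the
    usual cosine-with-warmup scheduler; simplified sketch)."""
    if step < warmup_steps:
        # Linear warmup: 0 -> peak_lr over the first warmup_steps steps.
        return peak_lr * step / warmup_steps
    # Cosine decay: peak_lr -> 0 over the remaining steps.
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return peak_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

For example, the rate peaks at 5e-4 exactly at step 1,000 and is roughly halved by the midpoint of the decay phase.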

Training results

| Training Loss | Epoch | Step  | Validation Loss | BLEU    | METEOR | ChrF++  |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:------:|:-------:|
| 4.5293        | 0.26  | 2500  | 2.0219          | 28.0137 | 0.5556 | 49.0872 |
| 2.4187        | 0.53  | 5000  | 1.7260          | 33.4178 | 0.6039 | 54.238  |
| 2.1694        | 0.79  | 7500  | 1.5868          | 35.7611 | 0.6256 | 56.142  |
| 2.0239        | 1.05  | 10000 | 1.4968          | 37.4453 | 0.6394 | 57.6176 |
| 1.9076        | 1.32  | 12500 | 1.4333          | 38.2643 | 0.6473 | 58.2946 |
| 1.8514        | 1.58  | 15000 | 1.3902          | 39.5846 | 0.6554 | 59.3566 |
| 1.7986        | 1.84  | 17500 | 1.3454          | 39.9905 | 0.6612 | 59.701  |
| 1.7351        | 2.11  | 20000 | 1.3096          | 40.8823 | 0.6664 | 60.3625 |
| 1.6834        | 2.37  | 22500 | 1.2872          | 41.4499 | 0.6711 | 60.837  |
| 1.6589        | 2.64  | 25000 | 1.2591          | 42.0603 | 0.6743 | 61.2625 |
| 1.6353        | 2.9   | 27500 | 1.2353          | 42.3878 | 0.6798 | 61.5963 |
| 1.5929        | 3.16  | 30000 | 1.2242          | 42.7944 | 0.6794 | 61.8097 |
| 1.5635        | 3.43  | 32500 | 1.2094          | 43.0109 | 0.6822 | 61.9923 |
| 1.5537        | 3.69  | 35000 | 1.2011          | 43.1533 | 0.6839 | 62.1374 |
| 1.5449        | 3.95  | 37500 | 1.1873          | 43.2691 | 0.6844 | 62.1844 |
| 1.5131        | 4.22  | 40000 | 1.1830          | 43.3538 | 0.6868 | 62.3435 |
| 1.5067        | 4.48  | 42500 | 1.1790          | 43.4627 | 0.6861 | 62.4529 |
| 1.502         | 4.74  | 45000 | 1.1761          | 43.4613 | 0.686  | 62.3993 |
| 1.4976        | 5.01  | 47500 | 1.1749          | 43.6051 | 0.6866 | 62.4784 |
| 1.4889        | 5.27  | 50000 | 1.1747          | 43.5153 | 0.6866 | 62.4637 |
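
ChrF++ in the table is a character n-gram F-score (chrF extended with word n-grams). A simplified sketch of plain chrF, without the word-n-gram "++" extension, illustrates the idea; sacrebleu's implementation is the reference, and the names below are illustrative only:

```python
from collections import Counter

def char_ngrams(text, n):
    """Multiset of character n-grams of length n."""
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def chrf(hypothesis, reference, max_n=6, beta=2.0):
    """Simplified chrF: average character n-gram precision and recall
    over n = 1..max_n, combined as an F-beta score (beta=2 weights
    recall twice as much as precision, as in the chrF paper)."""
    precisions, recalls = [], []
    for n in range(1, max_n + 1):
        hyp, ref = char_ngrams(hypothesis, n), char_ngrams(reference, n)
        if not hyp or not ref:
            continue  # strings shorter than n contribute nothing
        overlap = sum((hyp & ref).values())  # clipped n-gram matches
        precisions.append(overlap / sum(hyp.values()))
        recalls.append(overlap / sum(ref.values()))
    p = sum(precisions) / len(precisions)
    r = sum(recalls) / len(recalls)
    if p + r == 0:
        return 0.0
    return (1 + beta**2) * p * r / (beta**2 * p + r)
```

An identical hypothesis and reference score 1.0, and fully disjoint strings score 0.0; note this sketch returns a 0-1 score, whereas the table reports it scaled to 0-100.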

Framework versions

  • Transformers 4.38.0
  • PyTorch 2.2.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2
Model size

  • 60.4M parameters (F32, Safetensors)