t5-small_fr-finetuned-en-to-it

This model is a fine-tuned version of din0s/t5-small-finetuned-en-to-fr, further trained for English-to-Italian translation on the ccmatrix dataset (a minimal usage sketch follows the results below). It achieves the following results on the evaluation set:

  • Loss: 2.3225
  • Bleu: 7.4222
  • Gen Len: 59.1127
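
For quick experimentation, a minimal inference sketch is shown below. It assumes the repository id din0s/t5-small_fr-finetuned-en-to-it, inferred from this card's title and the base model's namespace; adjust the id if the model is hosted elsewhere.

```python
# Minimal inference sketch; the repo id is assumed from the card title.
from transformers import pipeline

translator = pipeline(
    "translation_en_to_it",
    model="din0s/t5-small_fr-finetuned-en-to-it",  # assumed repository id
)

result = translator("The weather is nice today.", max_length=128)
print(result[0]["translation_text"])
```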

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged Trainer-API sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 96
  • eval_batch_size: 96
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 40
  • mixed_precision_training: Native AMP
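
As a rough guide, the settings above map onto the Hugging Face Trainer API as in the sketch below. The output directory and generation flag are placeholders and assumptions, not taken from the original training script.

```python
# Hedged sketch: the hyperparameters above expressed as
# Seq2SeqTrainingArguments; output_dir is a placeholder.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small_fr-finetuned-en-to-it",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=96,
    per_device_eval_batch_size=96,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=40,
    fp16=True,  # Native AMP mixed-precision training
    predict_with_generate=True,  # assumed; needed for Bleu / Gen Len at eval time
)
```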

Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu   | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|
| No log        | 1.0   | 94   | 3.0406          | 3.2546 | 52.6127 |
| No log        | 2.0   | 188  | 2.9278          | 3.1206 | 62.774  |
| No log        | 3.0   | 282  | 2.8573          | 3.4206 | 63.6707 |
| No log        | 4.0   | 376  | 2.8030          | 3.4847 | 66.408  |
| No log        | 5.0   | 470  | 2.7602          | 3.8933 | 64.362  |
| 3.2982        | 6.0   | 564  | 2.7185          | 3.9298 | 66.058  |
| 3.2982        | 7.0   | 658  | 2.6842          | 4.0344 | 65.5773 |
| 3.2982        | 8.0   | 752  | 2.6536          | 4.3243 | 65.0047 |
| 3.2982        | 9.0   | 846  | 2.6233          | 4.5078 | 64.5813 |
| 3.2982        | 10.0  | 940  | 2.5966          | 4.6657 | 63.654  |
| 2.9837        | 11.0  | 1034 | 2.5743          | 4.7664 | 63.326  |
| 2.9837        | 12.0  | 1128 | 2.5526          | 4.9535 | 62.7327 |
| 2.9837        | 13.0  | 1222 | 2.5303          | 5.1386 | 63.5887 |
| 2.9837        | 14.0  | 1316 | 2.5122          | 5.1037 | 64.1667 |
| 2.9837        | 15.0  | 1410 | 2.4937          | 5.3304 | 63.116  |
| 2.8416        | 16.0  | 1504 | 2.4797          | 5.5006 | 61.4953 |
| 2.8416        | 17.0  | 1598 | 2.4627          | 5.5892 | 62.01   |
| 2.8416        | 18.0  | 1692 | 2.4497          | 5.8497 | 61.42   |
| 2.8416        | 19.0  | 1786 | 2.4372          | 6.0074 | 61.1587 |
| 2.8416        | 20.0  | 1880 | 2.4256          | 6.1464 | 60.522  |
| 2.8416        | 21.0  | 1974 | 2.4148          | 6.3117 | 59.5567 |
| 2.7428        | 22.0  | 2068 | 2.4039          | 6.4626 | 59.532  |
| 2.7428        | 23.0  | 2162 | 2.3939          | 6.5287 | 60.2307 |
| 2.7428        | 24.0  | 2256 | 2.3857          | 6.6093 | 60.22   |
| 2.7428        | 25.0  | 2350 | 2.3772          | 6.8004 | 59.396  |
| 2.7428        | 26.0  | 2444 | 2.3703          | 6.9433 | 59.5027 |
| 2.6779        | 27.0  | 2538 | 2.3631          | 7.0153 | 59.1433 |
| 2.6779        | 28.0  | 2632 | 2.3575          | 7.1783 | 58.9793 |
| 2.6779        | 29.0  | 2726 | 2.3514          | 7.1639 | 59.362  |
| 2.6779        | 30.0  | 2820 | 2.3457          | 7.2176 | 58.9927 |
| 2.6779        | 31.0  | 2914 | 2.3411          | 7.2599 | 59.1433 |
| 2.6335        | 32.0  | 3008 | 2.3374          | 7.284  | 59.1787 |
| 2.6335        | 33.0  | 3102 | 2.3339          | 7.3678 | 59.07   |
| 2.6335        | 34.0  | 3196 | 2.3307          | 7.3364 | 58.9813 |
| 2.6335        | 35.0  | 3290 | 2.3281          | 7.3318 | 58.96   |
| 2.6335        | 36.0  | 3384 | 2.3259          | 7.394  | 59.0787 |
| 2.6335        | 37.0  | 3478 | 2.3245          | 7.4133 | 59.0393 |
| 2.609         | 38.0  | 3572 | 2.3232          | 7.383  | 59.1887 |
| 2.609         | 39.0  | 3666 | 2.3227          | 7.4105 | 59.1227 |
| 2.609         | 40.0  | 3760 | 2.3225          | 7.4222 | 59.1127 |
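
The exact evaluation script is not given on this card; the sketch below shows one common way such Bleu scores are computed, using the evaluate library's sacrebleu metric. The example sentences are illustrative only, not drawn from ccmatrix.

```python
# Hedged sketch of computing corpus BLEU as reported above; the
# example predictions/references are illustrative, not from ccmatrix.
import evaluate

bleu = evaluate.load("sacrebleu")
predictions = ["Il tempo è bello oggi."]  # model outputs (illustrative)
references = [["Oggi c'è bel tempo."]]    # gold translations (illustrative)

score = bleu.compute(predictions=predictions, references=references)["score"]
print(round(score, 4))

# "Gen Len" is the mean length of the generated sequences, typically
# counted in non-padding token ids rather than words.
```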

Framework versions

  • Transformers 4.22.1
  • Pytorch 1.12.1
  • Datasets 2.5.1
  • Tokenizers 0.11.0