# mbart-large-50-many-to-many-mmt-finetuned-en-to-hi
This model is a fine-tuned version of [facebook/mbart-large-50-many-to-many-mmt](https://huggingface.co/facebook/mbart-large-50-many-to-many-mmt) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.7540
- BLEU: 9.5451
- Gen Len: 6.3699
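
The card does not include a usage snippet, so here is a minimal inference sketch for English-to-Hindi translation. The repo id below is a hypothetical placeholder; `en_XX` and `hi_IN` are the standard mBART-50 language codes for English and Hindi.

```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

# Hypothetical repo id -- replace with the actual checkpoint location.
model_name = "your-username/mbart-large-50-many-to-many-mmt-finetuned-en-to-hi"

tokenizer = MBart50TokenizerFast.from_pretrained(model_name)
model = MBartForConditionalGeneration.from_pretrained(model_name)

# mBART-50 is multilingual: set the source language before tokenizing,
# and force the target language as the first generated token.
tokenizer.src_lang = "en_XX"
inputs = tokenizer("How are you today?", return_tensors="pt")
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["hi_IN"],
    max_length=32,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```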
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
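
As a rough reconstruction, the hyperparameters above map onto a `Seq2SeqTrainingArguments` configuration like the one below. This is a sketch, not the actual training script: the output directory, the evaluation cadence, and `predict_with_generate` are assumptions; the numeric values come from the list above.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the configuration implied by the hyperparameter list.
# output_dir, evaluation cadence, and predict_with_generate are assumptions;
# the optimizer (Adam, betas=(0.9, 0.999), eps=1e-8) is the Trainer default.
training_args = Seq2SeqTrainingArguments(
    output_dir="mbart-large-50-many-to-many-mmt-finetuned-en-to-hi",
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    fp16=True,                    # "Native AMP" mixed precision
    evaluation_strategy="steps",  # the results table reports eval every 500 steps
    eval_steps=500,
    predict_with_generate=True,   # required to compute BLEU / Gen Len
)
```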
### Training results
| Training Loss | Epoch | Step | Validation Loss | BLEU | Gen Len |
|---|---|---|---|---|---|
3.2113 | 0.0 | 500 | 2.8535 | 4.975 | 6.0562 |
2.8197 | 0.01 | 1000 | 2.6902 | 3.2399 | 6.2449 |
2.6595 | 0.01 | 1500 | 2.6256 | 1.2488 | 6.5596 |
2.679 | 0.01 | 2000 | 2.5394 | 4.8497 | 6.1644 |
2.5408 | 0.02 | 2500 | 2.4751 | 2.294 | 6.3051 |
2.4689 | 0.02 | 3000 | 2.4052 | 3.6526 | 6.3714 |
2.4299 | 0.02 | 3500 | 2.3642 | 3.4783 | 6.4584 |
2.4303 | 0.02 | 4000 | 2.3545 | 4.9837 | 6.343 |
2.3715 | 0.03 | 4500 | 2.3138 | 5.3665 | 6.1981 |
2.3354 | 0.03 | 5000 | 2.3002 | 5.8659 | 6.109 |
2.3734 | 0.03 | 5500 | 2.2877 | 3.8468 | 6.443 |
2.2968 | 0.04 | 6000 | 2.2385 | 3.1561 | 6.4381 |
2.272 | 0.04 | 6500 | 2.2269 | 5.7378 | 6.1155 |
2.2691 | 0.04 | 7000 | 2.2244 | 7.8318 | 6.2325 |
2.2333 | 0.05 | 7500 | 2.1973 | 5.4549 | 6.1463 |
2.2518 | 0.05 | 8000 | 2.1996 | 2.3346 | 6.4954 |
2.251 | 0.05 | 8500 | 2.1682 | 4.7228 | 6.4063 |
2.252 | 0.06 | 9000 | 2.2211 | 5.7043 | 6.3745 |
2.288 | 0.06 | 9500 | 2.2017 | 3.7285 | 6.4893 |
2.2443 | 0.06 | 10000 | 2.1300 | 7.0869 | 6.4448 |
2.1723 | 0.06 | 10500 | 2.1076 | 4.3789 | 6.3482 |
2.1371 | 0.07 | 11000 | 2.1024 | 4.2024 | 6.547 |
2.1206 | 0.07 | 11500 | 2.1045 | 7.2662 | 6.1786 |
2.1325 | 0.07 | 12000 | 2.0903 | 3.8746 | 6.344 |
2.1315 | 0.08 | 12500 | 2.0817 | 5.9713 | 6.4415 |
2.1285 | 0.08 | 13000 | 2.0637 | 7.7832 | 6.1534 |
2.1223 | 0.08 | 13500 | 2.0522 | 2.9569 | 6.313 |
2.1036 | 0.09 | 14000 | 2.0505 | 5.0732 | 6.3308 |
2.1053 | 0.09 | 14500 | 2.0288 | 6.2772 | 6.1638 |
2.1122 | 0.09 | 15000 | 2.0232 | 6.8311 | 6.2005 |
2.0566 | 0.1 | 15500 | 2.0175 | 5.8541 | 6.1907 |
2.0783 | 0.1 | 16000 | 2.0147 | 10.0926 | 6.1531 |
2.0775 | 0.1 | 16500 | 2.0128 | 7.6705 | 6.2624 |
2.0952 | 0.1 | 17000 | 1.9951 | 5.5904 | 6.2104 |
2.115 | 0.11 | 17500 | 1.9806 | 8.0092 | 6.2081 |
2.0515 | 0.11 | 18000 | 1.9769 | 5.9444 | 6.2055 |
2.0698 | 0.11 | 18500 | 1.9611 | 8.6585 | 6.2591 |
2.0521 | 0.12 | 19000 | 1.9715 | 9.1678 | 6.2758 |
2.0581 | 0.12 | 19500 | 1.9538 | 7.0038 | 6.2019 |
2.0073 | 0.12 | 20000 | 1.9502 | 7.6102 | 6.3093 |
2.0104 | 0.13 | 20500 | 1.9414 | 7.7584 | 6.1554 |
2.0163 | 0.13 | 21000 | 1.9404 | 5.8758 | 6.3561 |
2.03 | 0.13 | 21500 | 1.9294 | 6.9283 | 6.1343 |
1.9915 | 0.14 | 22000 | 1.9159 | 5.6757 | 6.2349 |
2.0158 | 0.14 | 22500 | 1.9234 | 7.5094 | 6.1197 |
1.9616 | 0.14 | 23000 | 1.9170 | 9.4006 | 6.2105 |
1.9954 | 0.14 | 23500 | 1.9008 | 2.9622 | 6.2355 |
2.0116 | 0.15 | 24000 | 1.9026 | 11.0333 | 6.0291 |
1.9742 | 0.15 | 24500 | 1.8973 | 4.504 | 6.3386 |
1.9805 | 0.15 | 25000 | 1.8955 | 3.8655 | 6.2335 |
1.9413 | 0.16 | 25500 | 1.8821 | 8.9818 | 6.1769 |
1.9311 | 0.16 | 26000 | 1.8851 | 6.7291 | 6.0846 |
1.9696 | 0.16 | 26500 | 1.8789 | 12.1041 | 6.1274 |
1.9419 | 0.17 | 27000 | 1.8687 | 7.2389 | 6.2407 |
1.959 | 0.17 | 27500 | 1.8688 | 7.984 | 6.3319 |
1.9449 | 0.17 | 28000 | 1.8655 | 6.7646 | 6.2376 |
1.961 | 0.18 | 28500 | 1.8541 | 9.8683 | 6.2369 |
1.9293 | 0.18 | 29000 | 1.8676 | 8.2689 | 6.0779 |
1.978 | 0.18 | 29500 | 1.8515 | 5.7599 | 6.1964 |
1.9121 | 0.18 | 30000 | 1.8508 | 7.8691 | 6.19 |
1.9566 | 0.19 | 30500 | 1.8350 | 7.7093 | 6.1696 |
1.9279 | 0.19 | 31000 | 1.8455 | 7.2261 | 6.2585 |
1.9717 | 0.19 | 31500 | 1.8374 | 8.7243 | 6.2427 |
1.9215 | 0.2 | 32000 | 1.8239 | 3.5888 | 6.242 |
1.937 | 0.2 | 32500 | 1.8352 | 3.352 | 6.2681 |
1.9103 | 0.2 | 33000 | 1.8260 | 7.5665 | 6.2443 |
1.9363 | 0.21 | 33500 | 1.8122 | 8.6132 | 6.1723 |
1.8938 | 0.21 | 34000 | 1.8175 | 8.49 | 6.3157 |
1.8869 | 0.21 | 34500 | 1.8156 | 7.8958 | 6.3069 |
1.9113 | 0.22 | 35000 | 1.8083 | 5.658 | 6.2682 |
1.9175 | 0.22 | 35500 | 1.8012 | 9.5439 | 6.3022 |
1.9283 | 0.22 | 36000 | 1.8032 | 9.1064 | 6.3756 |
1.9227 | 0.22 | 36500 | 1.7899 | 8.7293 | 6.2953 |
1.9129 | 0.23 | 37000 | 1.7822 | 4.2586 | 6.2276 |
1.8733 | 0.23 | 37500 | 1.7789 | 9.1095 | 6.2261 |
1.8986 | 0.23 | 38000 | 1.7831 | 4.853 | 6.2544 |
1.8655 | 0.24 | 38500 | 1.7762 | 7.4264 | 6.3151 |
1.8996 | 0.24 | 39000 | 1.7648 | 7.4422 | 6.3657 |
1.8771 | 0.24 | 39500 | 1.7690 | 12.3696 | 6.1305 |
1.8719 | 0.25 | 40000 | 1.7565 | 10.0389 | 6.2719 |
1.8993 | 0.25 | 40500 | 1.7481 | 14.4891 | 6.2271 |
1.8756 | 0.25 | 41000 | 1.7516 | 7.2393 | 6.2737 |
1.8195 | 0.26 | 41500 | 1.7534 | 9.0375 | 6.2557 |
1.8384 | 0.26 | 42000 | 1.7540 | 9.5451 | 6.3699 |
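
The BLEU and Gen Len columns are the typical outputs of a `compute_metrics` hook passed to `Seq2SeqTrainer`. The card does not include the metric code for this run, so the following is a sketch of the usual setup with the `evaluate` library; it assumes the `tokenizer` loaded in the snippet near the top of the card.

```python
import numpy as np
import evaluate

sacrebleu = evaluate.load("sacrebleu")

def compute_metrics(eval_preds):
    """Typical Seq2SeqTrainer metric hook producing BLEU and Gen Len."""
    preds, labels = eval_preds
    # Replace the -100 padding used for the loss with the pad token id
    # so the labels can be decoded. Assumes `tokenizer` from above.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)

    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)

    result = sacrebleu.compute(
        predictions=[p.strip() for p in decoded_preds],
        references=[[l.strip()] for l in decoded_labels],
    )
    # Gen Len: mean number of non-pad tokens in the generated sequences.
    gen_lens = [np.count_nonzero(p != tokenizer.pad_token_id) for p in preds]
    return {"bleu": result["score"], "gen_len": np.mean(gen_lens)}
```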
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.2+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0