# IndoT5 Machine Translation
A collection of mT5 models fine-tuned on languages of Indonesia. Based on *Many-to-Many Multilingual Translation Model for Languages of Indonesia* (Wongso et al., 2023).
This model is a fine-tuned version of google/mt5-base on the lazarus-project/alkitab-sabda-mt dataset. On the evaluation set it achieves the results shown in the training results table below.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results
| Training Loss | Epoch | Step | Validation Loss | Bleu   | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|
| 1.5774        | 7.64  | 1000 | 1.5461          | 3.3312 | 18.8762 |
| 1.1370        | 15.28 | 2000 | 1.4426          | 3.8148 | 18.8755 |
| 0.9109        | 22.92 | 3000 | 1.4754          | 3.9571 | 18.8752 |
| 0.7807        | 30.56 | 4000 | 1.5373          | 3.9767 | 18.8761 |
| 0.7288        | 38.20 | 5000 | 1.5657          | 3.9838 | 18.8778 |
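The Bleu column above reports the BLEU metric, which scores a translation by its clipped n-gram overlap with a reference, scaled by a brevity penalty. As a rough illustration of how such a score is computed (a minimal, unsmoothed sentence-level sketch of the standard BLEU-4 formula, not the exact evaluation code used during training — real evaluations use corpus-level statistics and smoothing, e.g. sacrebleu):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Unsmoothed sentence-level BLEU with uniform n-gram weights.

    A minimal sketch for illustration only; returns 0.0 as soon as any
    n-gram order has no overlap, where real implementations smooth."""
    cand, ref = candidate.split(), reference.split()
    log_precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(cand, n))
        ref_counts = Counter(ngrams(ref, n))
        # Clipped matches: each candidate n-gram counts at most as often
        # as it appears in the reference.
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(len(cand) - n + 1, 0)
        if total == 0 or overlap == 0:
            return 0.0
        log_precisions.append(math.log(overlap / total))
    # Brevity penalty discourages overly short candidates.
    bp = 1.0 if len(cand) >= len(ref) else math.exp(1 - len(ref) / len(cand))
    return bp * math.exp(sum(log_precisions) / max_n)

print(round(bleu("the cat sat on the mat", "the cat sat on the mat"), 4))  # perfect match → 1.0
```

Note that BLEU is conventionally reported on a 0–100 scale (as in the table above); the sketch returns the 0–1 form.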