byt5-base-en-zu-mt

Model description

byt5-base-en-zu-mt is an English-to-isiZulu machine translation model, fine-tuned from google/byt5-base. It establishes a strong baseline for automatically translating English text into isiZulu.
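A minimal inference sketch with the Hugging Face transformers library is shown below. The model ID `masakhane/byt5-base-en-zu-mt` is an assumption based on the card title; substitute the actual repository path if it differs.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Assumed model ID based on the card title; adjust if the repo path differs.
MODEL_ID = "masakhane/byt5-base-en-zu-mt"


def translate(text: str) -> str:
    """Translate an English sentence into isiZulu (greedy/beam decoding sketch)."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = T5ForConditionalGeneration.from_pretrained(MODEL_ID)
    inputs = tokenizer(text, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=128)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(translate("Good morning"))
```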

Limitations and bias

This model is limited by its training data and may not generalize well to use cases in other domains.

Training data

Specifically, this model is a byt5-base model fine-tuned on the JW300 isiZulu corpus and LAFAND. Because the pre-trained model does not support isiZulu out of the box, isiXhosa (xh_ZA) was used as the language during training. Thus, you need to use the same xh_ZA language code when evaluating the model.

Training procedure

This model was trained on an NVIDIA V100 GPU.

Eval results on Test set (BLEU score)

Fine-tuning byt5-base achieves a BLEU score of 13.8 on the LAFAND test set.

BibTeX entry and citation info

By David Adelani

