# mbart50-large-yor-eng-mt

## Model description

mbart50-large-yor-eng-mt is a machine translation model from Yorùbá to English, based on a fine-tuned facebook/mbart-large-50 model. It establishes a strong baseline for automatically translating texts from Yorùbá to English.

Specifically, this model is an mbart-large-50 model that was fine-tuned on the JW300 Yorùbá corpus and the Menyo-20k dataset. Because the pre-trained model does not support Yorùbá, the model was trained using Swahili (sw_KE) as the source language code. Thus, you need to use sw_KE as the language code when evaluating or running the model.
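A minimal inference sketch with the transformers library, using sw_KE as the stand-in source-language code as described above (the repo id passed to `translate` is a placeholder, not the model's actual Hub id):

```python
SRC_LANG = "sw_KE"  # mBART-50 has no Yorùbá code, so Swahili's code stands in
TGT_LANG = "en_XX"  # English

def translate(text: str, model_name: str = "your-org/mbart50-large-yor-eng-mt") -> str:
    """Translate a Yorùbá sentence to English.

    `model_name` is a placeholder; substitute the actual Hub repo id.
    """
    # Lazy import so the constants above are usable without transformers installed.
    from transformers import MBart50TokenizerFast, MBartForConditionalGeneration

    # src_lang tells the tokenizer to prepend the sw_KE language token.
    tokenizer = MBart50TokenizerFast.from_pretrained(model_name, src_lang=SRC_LANG)
    model = MBartForConditionalGeneration.from_pretrained(model_name)

    inputs = tokenizer(text, return_tensors="pt")
    # forced_bos_token_id makes the decoder start generating in English.
    generated = model.generate(
        **inputs,
        forced_bos_token_id=tokenizer.lang_code_to_id[TGT_LANG],
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]
```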

#### Limitations and bias

This model is limited by its training data, and may not generalize well to all use cases or to domains outside those datasets.

## Training data

This model was fine-tuned on the JW300 Yorùbá corpus and the Menyo-20k dataset.

## Training procedure

This model was trained on an NVIDIA V100 GPU.

## Eval results on Test set (BLEU score)

Fine-tuning mbart50-large achieves a BLEU score of 15.88 on the Menyo-20k test set, while mt5-base achieves 15.57.
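For reference, corpus-level BLEU can be sketched as below. This is a simplified, unsmoothed, single-reference implementation in pure Python; the scores reported above were presumably computed with a standard toolkit (the card does not say which), so treat this as an illustration of the metric rather than the exact evaluation setup:

```python
import math
from collections import Counter

def corpus_bleu(hypotheses, references, max_n=4):
    """Unsmoothed corpus BLEU (0-100), one reference per hypothesis.

    Whitespace tokenization; not identical to standard toolkits,
    which handle tokenization and smoothing more carefully.
    """
    match = [0] * max_n   # clipped n-gram matches per order
    total = [0] * max_n   # candidate n-gram counts per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        h, r = hyp.split(), ref.split()
        hyp_len += len(h)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            h_ngrams = Counter(tuple(h[i:i + n]) for i in range(len(h) - n + 1))
            r_ngrams = Counter(tuple(r[i:i + n]) for i in range(len(r) - n + 1))
            match[n - 1] += sum((h_ngrams & r_ngrams).values())  # clipped counts
            total[n - 1] += max(len(h) - n + 1, 0)
    if min(match) == 0:
        return 0.0  # any zero n-gram precision makes unsmoothed BLEU zero
    log_prec = sum(math.log(m / t) for m, t in zip(match, total)) / max_n
    bp = math.exp(min(0.0, 1 - ref_len / hyp_len))  # brevity penalty
    return 100 * bp * math.exp(log_prec)
```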

### BibTeX entry and citation info