M2M100 418M

M2M100 is a multilingual encoder-decoder (seq2seq) transformer model trained for many-to-many multilingual translation. The model was originally introduced by researchers at Facebook and achieves strong results on cross-lingual translation tasks. For more background on M2M100, see the paper and the associated repository.
To adapt M2M100 to English-to-Arabic translation, we fine-tuned it on English–Arabic parallel text. The fine-tuning process trained the model for 1000K steps with a batch size of 8.
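A minimal usage sketch for the fine-tuned checkpoint, following the standard `transformers` M2M100 API (the model name is taken from this card; the input sentence is illustrative):

```python
MODEL_NAME = "khalidalt/m2m100_418M-finetuned-en-to-ar"
SRC_LANG = "en"  # source language code (English)
TGT_LANG = "ar"  # target language code (Arabic)

def translate(text: str) -> str:
    """Translate an English sentence to Arabic with the fine-tuned model."""
    # Imported lazily so the constants above can be used without transformers installed.
    from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

    tokenizer = M2M100Tokenizer.from_pretrained(MODEL_NAME)
    model = M2M100ForConditionalGeneration.from_pretrained(MODEL_NAME)

    tokenizer.src_lang = SRC_LANG
    encoded = tokenizer(text, return_tensors="pt")
    # Force the decoder to start with the Arabic language token,
    # as required by M2M100 for choosing the target language.
    generated = model.generate(
        **encoded, forced_bos_token_id=tokenizer.get_lang_id(TGT_LANG)
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]

if __name__ == "__main__":
    print(translate("Life is like a box of chocolates."))
```

Setting `forced_bos_token_id` is how M2M100 selects the output language; without it, the model may decode into the wrong language.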

Datasets used to train khalidalt/m2m100_418M-finetuned-en-to-ar