---
license: cc-by-2.0
datasets:
  - mbazaNLP/NMT_Tourism_parallel_data_en_kin
  - mbazaNLP/NMT_Education_parallel_data_en_kin
  - mbazaNLP/Kinyarwanda_English_parallel_dataset
language:
  - en
  - rw
library_name: transformers
---

## Model Details

### Model Description

This is a machine translation model, finetuned from NLLB-200's distilled 1.3B model. It is meant to be used for English-Kinyarwanda machine translation of education-related data.

- Finetuning code repository: the code used to finetune this model can be found here

## How to Get Started with the Model

Use the code below to get started with the model.
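The snippet below is a minimal sketch rather than an official example: the repository ID is a placeholder to replace with this model's actual Hub name, and the language codes follow NLLB's FLORES-200 convention (`eng_Latn`, `kin_Latn`).

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_ID = "mbazaNLP/your-model-id"  # placeholder: replace with this model's Hub repository name

# NLLB-style tokenizers take the source language code at load time.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, src_lang="eng_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

text = "Schools will reopen next week."
inputs = tokenizer(text, return_tensors="pt")

# Force Kinyarwanda as the first generated token so the model decodes into the target language.
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("kin_Latn"),
    max_length=128,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```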

## Training Procedure

The model was finetuned on three datasets: a general-purpose dataset, a tourism dataset, and an education dataset. Finetuning was done on an A100 40 GB GPU for two epochs.
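For reference, the sketch below shows one plausible way to run such a finetuning with the Hugging Face `Seq2SeqTrainer`. It is not the code from the linked repository; the base checkpoint is the public distilled 1.3B NLLB model, and the dataset split and column names (`train`, `en`, `rw`) as well as the batch size and learning rate are assumptions.

```python
from datasets import concatenate_datasets, load_dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

BASE = "facebook/nllb-200-distilled-1.3B"
tokenizer = AutoTokenizer.from_pretrained(BASE, src_lang="eng_Latn", tgt_lang="kin_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(BASE)

# Combine the three parallel corpora listed in the metadata above.
# Assumes each dataset exposes a "train" split with matching "en"/"rw" text columns.
raw = concatenate_datasets([
    load_dataset("mbazaNLP/NMT_Tourism_parallel_data_en_kin", split="train"),
    load_dataset("mbazaNLP/NMT_Education_parallel_data_en_kin", split="train"),
    load_dataset("mbazaNLP/Kinyarwanda_English_parallel_dataset", split="train"),
])

def preprocess(batch):
    # Tokenize source and target sides in one pass.
    return tokenizer(batch["en"], text_target=batch["rw"], truncation=True, max_length=128)

tokenized = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="nllb-en-kin-education",
    num_train_epochs=2,                # two epochs, as stated above
    per_device_train_batch_size=8,     # assumption; adjust to fit a 40 GB A100
    learning_rate=1e-4,                # assumption
    fp16=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```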

## Evaluation

### Testing Data

### Metrics

Model performance was measured using BLEU, spBLEU, TER, and chrF++ metrics.
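For reference, all four metrics can be computed with the `sacrebleu` library. The snippet below is a minimal sketch with placeholder sentences, not the evaluation script used for this model; spBLEU assumes a sacrebleu version that ships the `flores200` (SentencePiece) tokenizer.

```python
import sacrebleu

hyps = ["system output sentence"]          # model translations
refs = [["reference translation"]]         # one list per reference stream

bleu   = sacrebleu.corpus_bleu(hyps, refs)
spbleu = sacrebleu.corpus_bleu(hyps, refs, tokenize="flores200")  # SentencePiece-based spBLEU
ter    = sacrebleu.corpus_ter(hyps, refs)
chrfpp = sacrebleu.corpus_chrf(hyps, refs, word_order=2)          # word_order=2 gives chrF++

print(bleu.score, spbleu.score, ter.score, chrfpp.score)
```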

### Results