Tags: Text2Text Generation · Transformers · PyTorch · English · Kinyarwanda · m2m_100 · Inference Endpoints

This repository is publicly accessible, but you must agree to share your contact information and accept the conditions to access its files and content.


Model Details

Model Description

This is a machine translation model finetuned from NLLB-200's distilled 1.3B model, intended for English↔Kinyarwanda translation of tourism-related text.

  • Finetuning code repository: the code used to finetune this model can be found here

How to Get Started with the Model

Use the code below to get started with the model.
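A minimal sketch of loading the model with the standard Hugging Face `transformers` API. The checkpoint is gated, so you must accept the conditions and be authenticated first; the example sentence and `max_length` value are illustrative assumptions, not part of this repository.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_NAME = "mbazaNLP/Nllb_finetuned_tourism_en_kin"  # gated; accept the conditions first
SRC_LANG = "eng_Latn"  # NLLB uses FLORES-200 language codes
TGT_LANG = "kin_Latn"  # Kinyarwanda


def translate(text: str) -> str:
    """Translate one English sentence to Kinyarwanda (sketch, not official usage code)."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME, src_lang=SRC_LANG)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)
    inputs = tokenizer(text, return_tensors="pt")
    # Force the decoder to start generating in the target language.
    outputs = model.generate(
        **inputs,
        forced_bos_token_id=tokenizer.convert_tokens_to_ids(TGT_LANG),
        max_length=128,
    )
    return tokenizer.batch_decode(outputs, skip_special_tokens=True)[0]


if __name__ == "__main__":
    print(translate("Where is the nearest hotel?"))
```

For the reverse direction (Kinyarwanda to English), swap the `src_lang` passed to the tokenizer and the forced BOS token.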

Training Procedure

The model was finetuned on three datasets: a general-purpose dataset, a tourism dataset, and an education dataset.

The model was finetuned in two phases.

Phase one:

  • General purpose dataset
  • Education dataset
  • Tourism dataset

Phase two:

  • Tourism dataset

Other than the change of datasets between phase one and phase two, no hyperparameters were modified. In both phases, the model was trained on an A100 40GB GPU for two epochs.

Evaluation

Metrics

Model performance was measured using BLEU, spBLEU, TER, and chrF++ metrics.

Results

Lang. Direction    BLEU     spBLEU    chrF++    TER
Eng -> Kin         28.37    40.62     56.48     59.71
Kin -> Eng         42.54    44.84     61.54     43.87
