Model description

bert-base-multilingual-cased-finetuned-kinyarwanda is a Kinyarwanda BERT model obtained by fine-tuning the bert-base-multilingual-cased model on Kinyarwanda texts. It provides better performance than multilingual BERT on named entity recognition datasets.

Specifically, this model is a bert-base-multilingual-cased model that was fine-tuned on a Kinyarwanda corpus.

Intended uses & limitations

How to use

You can use this model with the Transformers pipeline for masked token prediction.

>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='Davlan/bert-base-multilingual-cased-finetuned-kinyarwanda')
>>> unmasker("Twabonye ko igihe mu [MASK] hazaba hari ikirango abantu bakunze")
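You can also load the tokenizer and model directly, for example as a starting point for downstream fine-tuning. This is a minimal sketch using the generic Transformers auto classes, not code from the original card:

>>> from transformers import AutoTokenizer, AutoModelForMaskedLM
>>> tokenizer = AutoTokenizer.from_pretrained('Davlan/bert-base-multilingual-cased-finetuned-kinyarwanda')
>>> model = AutoModelForMaskedLM.from_pretrained('Davlan/bert-base-multilingual-cased-finetuned-kinyarwanda')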

Limitations and bias

This model is limited by its training corpus of religious texts and news articles from a specific span of time. It may not generalize well to all use cases in other domains.

Training data

This model was fine-tuned on JW300 + KIRNEWS + BBC Gahuza.

Training procedure

This model was trained on a single NVIDIA V100 GPU.
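The exact training script and hyperparameters are not given in this card. The sketch below shows one plausible way to reproduce this kind of continued pretraining with the Transformers Trainer; the corpus file name train.txt and all hyperparameter values are illustrative assumptions, not the settings used for this model.

from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                          DataCollatorForLanguageModeling,
                          LineByLineTextDataset, Trainer, TrainingArguments)

# Start from multilingual BERT, as this model did.
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-multilingual-cased")

# Hypothetical plain-text Kinyarwanda corpus, one sentence per line.
dataset = LineByLineTextDataset(tokenizer=tokenizer,
                                file_path="train.txt",
                                block_size=128)

# Standard masked-language-modeling objective (15% of tokens masked).
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer,
                                           mlm=True,
                                           mlm_probability=0.15)

args = TrainingArguments(output_dir="bert-base-multilingual-cased-finetuned-kinyarwanda",
                         num_train_epochs=3,              # assumed, not reported
                         per_device_train_batch_size=16)  # assumed, not reported

Trainer(model=model,
        args=args,
        data_collator=collator,
        train_dataset=dataset).train()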

Eval results on test set (F1 score, average over 5 runs)

Dataset      mBERT F1   rw_bert F1
MasakhaNER   72.20      77.57
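The rw_bert scores above come from evaluating on the MasakhaNER dataset. As an illustration only, a token-classification head can be attached with the standard auto class; num_labels=9 is an assumption matching the MasakhaNER tag set (PER, ORG, LOC, DATE in BIO format, plus O), and the actual fine-tuning loop is omitted:

>>> from transformers import AutoModelForTokenClassification
>>> ner_model = AutoModelForTokenClassification.from_pretrained(
...     'Davlan/bert-base-multilingual-cased-finetuned-kinyarwanda', num_labels=9)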

BibTeX entry and citation info

By David Adelani
