bert-base-multilingual-cased-finetuned-igbo is an Igbo BERT model obtained by fine-tuning the bert-base-multilingual-cased model on Igbo language texts. It provides better performance than multilingual BERT on text classification and named entity recognition datasets.
Specifically, this model is a bert-base-multilingual-cased model that was fine-tuned on an Igbo corpus.
You can use this model with the Transformers pipeline for masked token prediction.
```python
from transformers import pipeline

unmasker = pipeline('fill-mask', model='Davlan/bert-base-multilingual-cased-finetuned-igbo')
unmasker("Reno Omokri na Gọọmentị [MASK] enweghị ihe ha ga-eji hiwe ya bụ mmachi.")
```
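The fill-mask pipeline returns a ranked list of candidate fillers for the `[MASK]` position, each entry carrying a `score`, the predicted `token_str`, and the completed `sequence`.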
This model is limited by its training dataset of entity-annotated news articles from a specific span of time. This may not generalize well for all use cases in different domains.
This model was trained on a single NVIDIA V100 GPU.
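The exact training script is not reproduced here, but the sketch below shows how masked-language-model fine-tuning of this kind is typically set up with the Transformers `Trainer`. The corpus file name (`igbo_corpus.txt`) and the hyperparameters are illustrative assumptions, not the author's actual configuration.

```python
# A minimal sketch of MLM fine-tuning of mBERT on an Igbo text corpus.
# File path and hyperparameters are assumptions for illustration only.
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

model_name = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Plain-text Igbo corpus, one passage per line (hypothetical file).
dataset = load_dataset("text", data_files={"train": "igbo_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# Randomly masks 15% of tokens for the masked-language-modeling objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="bert-base-multilingual-cased-finetuned-igbo",
    per_device_train_batch_size=16,
    num_train_epochs=3,
    save_steps=10_000,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
```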
| Dataset | mBERT F1 | ig_bert F1 |
| --- | --- | --- |
By David Adelani