# RoBERTa in Swahili

This model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.

## How to use

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("flax-community/roberta-swahili")
model = AutoModelForMaskedLM.from_pretrained("flax-community/roberta-swahili")

print(round(model.num_parameters() / (1000 * 1000)), "Million Parameters")
# 105 Million Parameters
```
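As a quick sanity check, the checkpoint can also be queried through the `fill-mask` pipeline. This is a minimal sketch; the Swahili prompt is an illustrative assumption, not an example from the original card.

```python
from transformers import pipeline

# Load a fill-mask pipeline backed by this checkpoint.
fill_mask = pipeline("fill-mask", model="flax-community/roberta-swahili")

# Illustrative Swahili prompt (assumed): "Ninaenda <mask>." ("I am going <mask>.")
for prediction in fill_mask(f"Ninaenda {fill_mask.tokenizer.mask_token}."):
    print(prediction["token_str"], round(prediction["score"], 3))
```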

## Training Data

This model was trained on the Swahili Safi dataset.
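
Assuming the dataset is published on the Hub under the id `flax-community/swahili-safi` (an assumption, since only the name appears above), it can be inspected with the `datasets` library:

```python
from datasets import load_dataset

# Assumed Hub id for Swahili Safi; streaming avoids downloading the full corpus.
dataset = load_dataset("flax-community/swahili-safi", split="train", streaming=True)

# Print the first training example.
print(next(iter(dataset)))
```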


## MasakhaNER

Eval metrics: 86% F1
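
As a sketch of how such a fine-tune could be set up, the checkpoint can be given a fresh token-classification head. The label list below follows MasakhaNER's CoNLL-style tag set; the training loop itself (e.g. with `transformers`' `Trainer`) is omitted.

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer

# MasakhaNER uses CoNLL-style BIO tags over PER/ORG/LOC/DATE entities.
labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG",
          "B-LOC", "I-LOC", "B-DATE", "I-DATE"]

tokenizer = AutoTokenizer.from_pretrained("flax-community/roberta-swahili")
model = AutoModelForTokenClassification.from_pretrained(
    "flax-community/roberta-swahili",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)
# The new classification head is randomly initialized; it must be
# fine-tuned on MasakhaNER before the reported F1 is achievable.
```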

A news classifier was fine-tuned from this model for the Zindi News Classification Challenge.
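
Similarly, that fine-tune would swap in a sequence-classification head. The number of labels below is a hypothetical placeholder; the challenge's actual category set is not given in this card.

```python
from transformers import AutoModelForSequenceClassification

# num_labels=5 is a hypothetical placeholder for the news categories.
model = AutoModelForSequenceClassification.from_pretrained(
    "flax-community/roberta-swahili", num_labels=5
)
```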

## More Details

For more details and a demo, please check out the HF Swahili Space.
