---
language: sw
widget:
- text: Ninitaka kukula
datasets:
- flax-community/swahili-safi
---
# GPT2 in Swahili
This model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
## How to use
```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("flax-community/gpt2-swahili")
model = AutoModelWithLMHead.from_pretrained("flax-community/gpt2-swahili")
```
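Once loaded, the model can be used for causal text generation. The snippet below is a minimal sketch that reuses the tokenizer and model loaded above; the prompt (taken from the widget example) and the sampling parameters are illustrative, not part of the original card:

```python
# Minimal generation sketch using the tokenizer/model loaded above.
# Sampling parameters are illustrative defaults, not from the original card.
inputs = tokenizer("Ninitaka kukula", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=50,
    do_sample=True,
    top_k=50,
    top_p=0.95,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```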
## Training Data:
This model was trained on the [Swahili Safi](https://huggingface.co/datasets/flax-community/swahili-safi) dataset.
## More Details:
For more details and a demo, please check the HF Swahili Space.