---
language: sw
widget:
  - text: Ninitaka kukula
datasets:
  - flax-community/swahili-safi
---

# GPT2 in Swahili

This model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.

## How to use

```python
# AutoModelWithLMHead is deprecated; for GPT-2, use AutoModelForCausalLM
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("flax-community/gpt2-swahili")
model = AutoModelForCausalLM.from_pretrained("flax-community/gpt2-swahili")

print(round(model.num_parameters() / (1000 * 1000)), "Million Parameters")
```

```
124 Million Parameters
```
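Once loaded, the model can be used for text generation like any GPT-2 checkpoint. A minimal sketch using the `text-generation` pipeline, with the widget prompt from the metadata above (the `max_new_tokens` value is illustrative, not a tuned setting):

```python
from transformers import pipeline

# The pipeline bundles the tokenizer and model for this checkpoint
generator = pipeline("text-generation", model="flax-community/gpt2-swahili")

# Generate a short continuation of a Swahili prompt
result = generator("Ninitaka kukula", max_new_tokens=20)
print(result[0]["generated_text"])
```

By default the pipeline returns the prompt followed by the generated continuation; pass `do_sample=True` for sampled rather than greedy output.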

## Training Data

This model was trained on the [Swahili Safi](https://huggingface.co/datasets/flax-community/swahili-safi) dataset.

## More Details

For more details and a demo, please check the HF Swahili Space.