Amharic GPT2
Collection of 6 GPT2 transformer decoder models pretrained on 290 million tokens of Amharic text
This is a smaller version of the GPT2 decoder transformer model, pretrained from scratch for 1.5 days on 290 million tokens of Amharic text.
It achieves the following results on the evaluation set:
- Loss: 3.59
- Perplexity: 36.23
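For reference, the reported perplexity is simply the exponential of the evaluation cross-entropy loss; a minimal sketch of that arithmetic:

```python
import math

# Perplexity is exp(cross-entropy loss) on the evaluation set.
loss = 3.59
perplexity = math.exp(loss)
print(round(perplexity, 2))  # ~36.23
```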
You can use the following demo to generate text with gpt2-small-amharic: enter a prompt and click the Generate button to produce completions.
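Alternatively, the model can be loaded programmatically with the Hugging Face `transformers` text-generation pipeline. The sketch below is a minimal example under assumptions: the repository id and the sampling parameters are placeholders, not values taken from this model card, so replace them with the actual Hub path and your preferred generation settings.

```python
from transformers import pipeline

# Assumed Hub repository id; substitute the actual path of gpt2-small-amharic.
generator = pipeline("text-generation", model="rasyosef/gpt2-small-amharic")

# Example Amharic prompt (hypothetical; any prompt works).
prompt = "በአዲስ አበባ"

# Sampling parameters are illustrative defaults, not the demo's settings.
outputs = generator(
    prompt,
    max_new_tokens=64,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    repetition_penalty=1.2,
)
print(outputs[0]["generated_text"])
```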