## Swahili News Classification with RoBERTa
This model was trained using HuggingFace's Flax framework and is part of the [JAX/Flax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104) organized by [HuggingFace](https://huggingface.co). All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
## How to use
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("flax-community/roberta-swahili-news-classification")
model = AutoModelForSequenceClassification.from_pretrained("flax-community/roberta-swahili-news-classification")
```
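
The classifier returns one logit per news category; the predicted class is the argmax of those logits, and a softmax turns them into probabilities. A minimal post-processing sketch (the logit values and the five-category width below are illustrative, not taken from this model):

```python
import torch

# Hypothetical logits for one input across 5 news categories (values are illustrative)
logits = torch.tensor([[1.2, -0.3, 3.1, 0.4, -1.0]])

probs = torch.softmax(logits, dim=-1)   # normalize logits into probabilities
pred_id = probs.argmax(dim=-1).item()   # index of the most likely category

print(pred_id)  # → 2
```

With the real model, `model.config.id2label[pred_id]` maps the predicted index back to a category name.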

```
Eval metrics: {'accuracy': 0.9153416415986249}
```