---
language: sw
widget:
- text: "Idris ameandika kwenye ukurasa wake wa Instagram akimkumbusha Diamond kutekeleza ahadi yake kumpigia Zari magoti kumuomba msamaha kama alivyowahi kueleza awali.Idris ameandika;"
datasets:
- flax-community/swahili-safi
---

## Swahili News Classification with RoBERTa


This model was trained with HuggingFace's Transformers library using the JAX/Flax framework, as part of the [JAX/Flax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104) organized by [HuggingFace](https://huggingface.co). All training was done on a TPU v3-8 VM sponsored by the Google Cloud team.

The pretrained [roberta-swahili](https://huggingface.co/flax-community/roberta-swahili) model was used as the base and fine-tuned for this classification task.
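
The original training script used Flax/JAX; the snippet below is only a minimal PyTorch sketch of the same fine-tuning pattern. The number of labels, the hyperparameters, and the dataset variables are illustrative assumptions, not the actual configuration used for this model.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base = "flax-community/roberta-swahili"
tokenizer = AutoTokenizer.from_pretrained(base)
# num_labels=6 is an assumption; set it to the number of news categories in your data.
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=6)

args = TrainingArguments(
    output_dir="roberta-swahili-news-classification",
    num_train_epochs=3,
    per_device_train_batch_size=16,
)

# `train_ds` and `eval_ds` are assumed to be tokenized datasets with a "labels" column.
# trainer = Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```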

## How to use

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load the fine-tuned tokenizer and classification model from the Hub.
tokenizer = AutoTokenizer.from_pretrained("flax-community/roberta-swahili-news-classification")
model = AutoModelForSequenceClassification.from_pretrained("flax-community/roberta-swahili-news-classification")
```
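
Once loaded, the model can classify a piece of Swahili news text. The following is a minimal inference sketch; the sample sentence is purely illustrative, and the label names come from the model's own `id2label` config.

```python
import torch

text = "Simba waliichapa Yanga mabao mawili kwa moja kwenye mechi ya Ligi Kuu."
inputs = tokenizer(text, truncation=True, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring class index back to its label name.
predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])
```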

```
Eval metrics: {'accuracy': 0.9153416415986249}
```