sakrah committed on
Commit
83babd6
1 Parent(s): 4830455

Update README.md

Files changed (1):
  1. README.md +3 -3
README.md CHANGED
@@ -5,7 +5,7 @@ license: mit
 ## TwiBERT
 ## Model Description
 TwiBERT is a pre-trained language model specifically designed for the Twi language, which is widely spoken in Ghana,
-West Africa. This model boasts 61 million parameters, 6 attention heads, 768 hidden units, and a feed-forward size of 3072.
+West Africa. This model has 61 million parameters, 6 layers, 6 attention heads, 768 hidden units, and a feed-forward size of 3072.
 To optimize its performance, TwiBERT was trained using a combination of the Asanti Twi Bible and a dataset
 sourced through crowdsourcing efforts.
 
@@ -25,6 +25,6 @@ The example code below illustrates how you can use the TwiBERT model on a downtr
 
 ```python
 >>> from transformers import AutoTokenizer, AutoModelForTokenClassification
->>> model = AutoModelForTokenClassification.from_pretrained("sakrah/twibert")
->>> tokenizer = AutoTokenizer.from_pretrained("sakrah/twibert")
+>>> model = AutoModelForTokenClassification.from_pretrained("sakrah/TwiBERT")
+>>> tokenizer = AutoTokenizer.from_pretrained("sakrah/TwiBERT")
 ```
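The corrected README snippet stops after loading the model and tokenizer. A minimal sketch of a full forward pass is below; the model id `sakrah/TwiBERT` comes from this commit, but the example sentence is an illustrative assumption, and since the repo's checkpoint is a pre-trained base model, the token-classification head (and its default number of labels) is randomly initialized by `transformers` rather than defined by this repo.

```python
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Model id as corrected in this commit (casing matters on the Hub).
tokenizer = AutoTokenizer.from_pretrained("sakrah/TwiBERT")
model = AutoModelForTokenClassification.from_pretrained("sakrah/TwiBERT")

# Illustrative Twi sentence (assumption, not from the README).
inputs = tokenizer("Me din de Kwame", return_tensors="pt")

# Forward pass without gradients; the untuned head yields per-token logits.
with torch.no_grad():
    outputs = model(**inputs)

# logits has shape (batch_size, sequence_length, num_labels).
print(outputs.logits.shape)
```

For an actual downstream task (e.g. NER), this head would first need fine-tuning on labeled Twi data; as loaded here its predictions are random.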