sakrah committed
Commit 4830455
1 Parent(s): 51534d7

Update README.md

Files changed (1)
  1. README.md +8 -7
README.md CHANGED
@@ -4,22 +4,23 @@ license: mit
  ---
  ## TwiBERT
  ## Model Description
- TwiBERT is a language model pretrained on the Twi language, the most spoken language in Ghana, West Africa.
- The model has 61 million parameters, 6 attention heads, 768 hidden units and 3072 feed forward size. The model
- was trained on the Asanti Twi Bible together with a crowdsourced dataset.
+ TwiBERT is a pre-trained language model for the Twi language, which is widely spoken in Ghana, West Africa.
+ The model has 61 million parameters, 6 attention heads, 768 hidden units, and a feed-forward size of 3072.
+ TwiBERT was trained on a combination of the Asanti Twi Bible and a dataset
+ collected through crowdsourcing.



  ## Limitations:

- The model was trained on a very small dataset (about 5MB), which makes it difficult for the model
- to learn complex contextual embeddings that will enable it to generalize. Plus, the scope of the dataset (the bible) might
- give it strong religious bias.
+ The model was trained on a relatively small dataset (approximately 5 MB),
+ which may limit its ability to learn rich contextual embeddings and to generalize effectively.
+ Additionally, the dataset's focus on the Bible could introduce a strong religious bias into the model's output.


  ## How to use it

- You can finetune TwiBERT by finetuning it on a downtream task.
+ You can use TwiBERT by fine-tuning it on a downstream task.
  The example code below illustrates how you can use the TwiBERT model on a downstream task:

  ```python