theyorubayesian committed on
Commit 6ded7d8
1 Parent(s): d34a5b4

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -27,7 +27,7 @@ language:
 
 # AfriTeVa V2 Large
 
-AfriTeVa V2 Large is a multilingual T5 [Version 1.1](https://github.com/google-research/text-to-text-transfer-transformer/blob/main/released_checkpoints.md#t511) model pretrained on [Wura](https://huggingface.co/datasets/castorini/wura) with a vocabulary size of 150,000. The model has been shown to improve over existing baselines on [Text Classification](https://huggingface.co/datasets/masakhane/masakhanews), [Machine Translation](https://huggingface.co/datasets/masakhane/mafand), [Summarization](https://huggingface.co/datasets/csebuetnlp/xlsum) and [Cross-lingual Question Answering](https://huggingface.co/datasets/masakhane/afriqa). The model has 1B parameters.
+AfriTeVa V2 Large is a multilingual T5 [Version 1.1](https://github.com/google-research/text-to-text-transfer-transformer/blob/main/released_checkpoints.md#t511) model pretrained on [Wura](https://huggingface.co/datasets/castorini/wura) with a vocabulary size of 150,000. The model has 1B parameters.
 
 Paper: [Better Quality Pretraining Data & T5 Models for African Languages](https://openreview.net/forum?id=ybc9V6Cbq2)
 
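The paragraph edited above describes a standard encoder-decoder checkpoint, so a minimal loading sketch with the Hugging Face transformers library may help readers of this diff. The repository id `castorini/afriteva_v2_large` is an assumption inferred from the dataset namespace in the links above, not something stated in this commit; check the model card for the canonical id.

```python
# Minimal sketch: loading AfriTeVa V2 Large with Hugging Face transformers.
# NOTE: the repository id below is an assumption, not confirmed by this commit.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

repo_id = "castorini/afriteva_v2_large"  # hypothetical id; verify before use
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSeq2SeqLM.from_pretrained(repo_id)

# T5 v1.1-style checkpoints are pretrained with span corruption only,
# so they generally need fine-tuning before downstream use.
inputs = tokenizer("Ẹ káàárọ̀", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```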