---
license: cc-by-sa-4.0
inference: false
---

# RoBERTa Tagalog Base

This is a Tagalog RoBERTa model, trained as an improvement over our previous pretrained Tagalog Transformers. It was trained on TLUnified, a newer, larger, and more topically varied pretraining corpus for Filipino. The model is part of a larger research project; we open-source it to allow greater usage within the Filipino NLP community.

This is a cased model. We do not release uncased RoBERTa models.
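The card itself does not include a usage snippet. As a minimal sketch using the standard Hugging Face `transformers` fill-mask pipeline — the repository id below is an assumption for illustration, not stated in this card — masked-token prediction with the model looks like this:

```python
# Minimal fill-mask sketch. The model id is an assumption (substitute the
# actual repository id from the Hub page); everything else is the standard
# transformers pipeline API.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="jcblaise/roberta-tagalog-base")

# RoBERTa-style tokenizers use "<mask>" as the mask token.
sentence = "Ako ay pupunta sa <mask> bukas."
for prediction in fill_mask(sentence):
    # Each prediction carries the filled token and its score.
    print(prediction["token_str"], prediction["score"])
```

Because this is a cased model, input casing is preserved; do not lowercase text before tokenization.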

## Citations

All model details and training setups can be found in our papers. If you use our model or find it useful in your projects, please cite our work: