---
language: tl
tags:
- roberta
- tagalog
- filipino
license: cc-by-sa-4.0
inference: false
---

# RoBERTa Tagalog Large
This is a RoBERTa language model for Tagalog, trained as an improvement over our previous pretrained Tagalog Transformers. It was trained on TLUnified, a newer, larger, and more topically varied pretraining corpus for Filipino. The model is part of a larger research project; we open-source it to enable wider use within the Filipino NLP community.

This is a cased model. We do not release uncased RoBERTa models.
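
As a minimal usage sketch, the model can be loaded with the Hugging Face `transformers` library for masked-token prediction. The model identifier below is an assumption based on this repository's name; check the Hub page for the exact id.

```python
from transformers import pipeline

# Assumed model id for this repository; verify on the Hugging Face Hub.
MODEL_ID = "jcblaise/roberta-tagalog-large"

def top_predictions(masked_text, k=5):
    """Return the top-k predicted tokens for a sentence containing
    the RoBERTa mask token, "<mask>"."""
    fill_mask = pipeline("fill-mask", model=MODEL_ID, top_k=k)
    return [pred["token_str"] for pred in fill_mask(masked_text)]

# Example call (downloads the model weights on first use):
# top_predictions("Kumain ako ng <mask> kaninang umaga.")
```

The pipeline is constructed inside the function so that importing this snippet does not trigger a model download.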

## Citations
All model details and training setups can be found in our papers. If you use our model or find it useful in your projects, please cite our work:

```
@article{cruz2021improving,
  title={Improving Large-scale Language Models and Resources for Filipino},
  author={Jan Christian Blaise Cruz and Charibeth Cheng},
  journal={arXiv preprint arXiv:2111.06053},
  year={2021}
}
```

## Data and Other Resources
The data used to train this model, as well as other benchmark datasets for Filipino, can be found on my website at https://blaisecruz.com

## Contact
If you have questions or concerns, or if you just want to chat about NLP and low-resource languages in general, you can reach me at my work email: me@blaisecruz.com