Update README.md
README.md (changed):

```diff
@@ -4,4 +4,4 @@ languages:
 licenses:
 - mit
 ---
-This is SinBERT-large model. SinBERT models are pretrained on a large Sinhala monolingual corpus (sin-cc-15M) using RoBERTa.
+This is the SinBERT-large model. SinBERT models are pretrained on a large Sinhala monolingual corpus (sin-cc-15M) using RoBERTa. If you use this model, please cite *BERTifying Sinhala - A Comprehensive Analysis of Pre-trained Language Models for Sinhala Text Classification, LREC 2022*.
```
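Since SinBERT is a RoBERTa-style masked language model, it can be loaded with the Hugging Face `transformers` library. The sketch below is a minimal, hedged example: the hub repository id `NLPC-UOM/SinBERT-large` and the Sinhala sample sentence are assumptions, not taken from this page, so substitute the actual model id shown on the model card.

```python
# Minimal sketch (assumptions noted): load SinBERT-large for masked-token
# prediction via the transformers fill-mask pipeline. The repository id
# "NLPC-UOM/SinBERT-large" is an assumed hub id; replace it with the id
# on the actual model card if it differs.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="NLPC-UOM/SinBERT-large")

# RoBERTa-style models use "<mask>" as the mask token.
predictions = fill_mask("මම පොතක් <mask>.")
for p in predictions:
    # Each prediction carries the filled token and its score.
    print(p["token_str"], round(p["score"], 4))
```

Because the model was pretrained only on the sin-cc-15M monolingual corpus, it is intended for Sinhala text; for classification tasks (as in the cited LREC 2022 paper) the same checkpoint would typically be fine-tuned with a sequence-classification head.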