---
language:
- si
license: mit
---
This is the SinBERT-large model. SinBERT models are pretrained on a large Sinhala monolingual corpus (sin-cc-15M) using the RoBERTa architecture. If you use this model, please cite *BERTifying Sinhala - A Comprehensive Analysis of Pre-trained Language Models for Sinhala Text Classification* (LREC 2022).
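
Since the model is a RoBERTa-style masked language model, it can be loaded with the Hugging Face `transformers` library. The sketch below is a minimal example, assuming a repo ID of `NLPC-UOM/SinBERT-large`; replace it with this model's actual Hub path if it differs.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumed repo ID; substitute the actual Hub path of this model if different.
model_id = "NLPC-UOM/SinBERT-large"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Encode a Sinhala sentence and run a forward pass through the masked LM head.
text = "මෙය උදාහරණ වාක්‍යයකි."  # "This is an example sentence."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.logits.shape)  # (batch_size, sequence_length, vocab_size)
```

For downstream Sinhala text classification, the same checkpoint can instead be loaded with `AutoModelForSequenceClassification` and fine-tuned on labeled data.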