---
license: mit
language:
- si
---
This is the SinBERT-small model. SinBERT models are pretrained on a large Sinhala monolingual corpus (sin-cc-15M) using the RoBERTa architecture. If you use this model, please cite *BERTifying Sinhala - A Comprehensive Analysis of Pre-trained Language Models for Sinhala Text Classification*, LREC 2022.