---
language:
- si
license:
- mit
---
This is the SinBERT-large model. SinBERT models are pretrained on a large Sinhala monolingual corpus (sin-cc-15M) using the RoBERTa architecture. If you use this model, please cite *BERTifying Sinhala - A Comprehensive Analysis of Pre-trained Language Models for Sinhala Text Classification, LREC 2022*.
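A minimal usage sketch with the Hugging Face `transformers` library is shown below. Since the model is RoBERTa-based and pretrained with masked language modeling, it can be loaded with the generic `AutoTokenizer`/`AutoModelForMaskedLM` classes; the Hub identifier `NLPC-UOM/SinBERT-large` used here is an assumption, so check the model page for the exact name.

```python
# Sketch: load SinBERT-large and run a forward pass for masked LM scoring.
# Assumes the model is published on the Hub under "NLPC-UOM/SinBERT-large";
# substitute the actual identifier from the model page if it differs.
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "NLPC-UOM/SinBERT-large"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Replace with a Sinhala sentence, optionally containing tokenizer.mask_token.
text = "..."
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)

# Logits have shape (batch_size, sequence_length, vocab_size).
print(outputs.logits.shape)
```

For downstream Sinhala text classification (the task studied in the cited paper), the same checkpoint can instead be loaded with `AutoModelForSequenceClassification` and fine-tuned on labeled data.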