---
language:
- ru
---
# rubert-base-cased
RuBERT (Russian, cased, 12-layer, 768-hidden, 12-heads, 180M parameters) was trained on the Russian part of Wikipedia and news data. We used this training data to build a vocabulary of Russian subtokens and took a multilingual version of BERT-base as an initialization for RuBERT[1].
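
A minimal usage sketch for extracting contextual embeddings, assuming the checkpoint is published on the Hugging Face Hub as `DeepPavlov/rubert-base-cased` and the `transformers` library is installed:

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Assumed Hub identifier for this checkpoint
model_name = "DeepPavlov/rubert-base-cased"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a Russian sentence and run it through the encoder
text = "Пример текста на русском языке."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch, seq_len, 768), matching the configuration above
print(outputs.last_hidden_state.shape)
```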
08.11.2021: uploaded the model with MLM and NSP heads
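
To use the uploaded MLM and NSP heads, a sketch along these lines should work (again assuming the `DeepPavlov/rubert-base-cased` Hub identifier):

```python
from transformers import AutoTokenizer, BertForPreTraining

tokenizer = AutoTokenizer.from_pretrained("DeepPavlov/rubert-base-cased")
model = BertForPreTraining.from_pretrained("DeepPavlov/rubert-base-cased")

inputs = tokenizer("Пример текста на русском языке.", return_tensors="pt")
outputs = model(**inputs)

# prediction_logits: per-token vocabulary scores from the MLM head
# seq_relationship_logits: is-next-sentence scores from the NSP head
print(outputs.prediction_logits.shape, outputs.seq_relationship_logits.shape)
```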
[1]: Kuratov, Y., Arkhipov, M. (2019). Adaptation of Deep Bidirectional Multilingual Transformers for Russian Language. arXiv preprint [arXiv:1905.07213](https://arxiv.org/abs/1905.07213).