---
language:
  - ru
---

# rubert-base-cased

RuBERT (Russian, cased, 12-layer, 768-hidden, 12-heads, 180M parameters) was trained on the Russian part of Wikipedia and on news data. We used this training data to build a vocabulary of Russian subtokens and took the multilingual version of BERT-base as the initialization for RuBERT [1].
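A minimal sketch of loading the model with the `transformers` library to extract contextual embeddings. The repo id `DeepPavlov/rubert-base-cased` is an assumption not stated in this card; adjust it to the actual Hub path if it differs.

```python
# Minimal sketch: load RuBERT and run a forward pass to get token embeddings.
# The repo id below is assumed, not confirmed by this card.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("DeepPavlov/rubert-base-cased")
model = AutoModel.from_pretrained("DeepPavlov/rubert-base-cased")

# Tokenize a Russian sentence and encode it with the model.
inputs = tokenizer("Привет, мир!", return_tensors="pt")
outputs = model(**inputs)

# last_hidden_state has shape (batch, seq_len, 768), matching the 768-hidden config.
print(outputs.last_hidden_state.shape)
```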

[1]: Kuratov, Y., Arkhipov, M. (2019). Adaptation of Deep Bidirectional Multilingual Transformers for Russian Language. arXiv preprint arXiv:1905.07213.