Model: DeepPavlov/rubert-base-cased


Frameworks: pytorch, tf

Contributed by DeepPavlov (MIPT)

How to use this model directly from the 🤗/transformers library:

from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("DeepPavlov/rubert-base-cased")
model = AutoModelWithLMHead.from_pretrained("DeepPavlov/rubert-base-cased")
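
As a minimal usage sketch (the Russian example sentence and the shape check are illustrative, not part of the original card), the loaded tokenizer and model can then be applied to text:

import torch

# Encode a Russian sentence into subtoken IDs (example input is hypothetical).
inputs = tokenizer("Привет, мир!", return_tensors="pt")

# Forward pass without gradient tracking; the first output element holds the
# LM-head logits with shape (batch_size, sequence_length, vocab_size).
with torch.no_grad():
    outputs = model(**inputs)
print(outputs[0].shape)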

rubert-base-cased

RuBERT (Russian, cased, 12-layer, 768-hidden, 12-heads, 180M parameters) was trained on the Russian part of Wikipedia and news data. We used this training data to build a vocabulary of Russian subtokens and took a multilingual version of BERT-base as an initialization for RuBERT [1].
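
The architecture figures above can be checked against the published configuration; a short sketch using the standard transformers config API (the expected values are those claimed by the card, not guaranteed output):

from transformers import AutoConfig

# Load the model configuration from the Hugging Face hub.
config = AutoConfig.from_pretrained("DeepPavlov/rubert-base-cased")

# Per the card: 12 layers, 768 hidden units, 12 attention heads.
print(config.num_hidden_layers, config.hidden_size, config.num_attention_heads)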

[1]: Kuratov, Y., Arkhipov, M. (2019). Adaptation of Deep Bidirectional Multilingual Transformers for Russian Language. arXiv preprint arXiv:1905.07213.