DeepPavlov/bert-base-multilingual-cased-sentence
How to use this model directly from the 🤗/transformers library:

from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("DeepPavlov/bert-base-multilingual-cased-sentence")
model = AutoModel.from_pretrained("DeepPavlov/bert-base-multilingual-cased-sentence")


Sentence Multilingual BERT (101 languages, cased, 12‑layer, 768‑hidden, 12‑heads, 180M parameters) is a representation‑based sentence encoder covering the 101 languages of Multilingual BERT. It is initialized from Multilingual BERT and then fine‑tuned on the English MultiNLI corpus[1] and on the dev set of the multilingual XNLI benchmark[2]. Sentence representations are mean‑pooled token embeddings, in the same manner as in Sentence‑BERT[3].
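Since `AutoModel` returns per-token hidden states rather than a single sentence vector, the mean pooling described above has to be applied on top of the model output. The sketch below illustrates that pooling step on toy tensors; `mean_pool` is an illustrative helper (not part of the transformers API), and the random tensors stand in for the model's `last_hidden_state` and the tokenizer's `attention_mask`.

```python
import torch

# A minimal sketch of mean pooling, assuming the Sentence-BERT convention:
# the sentence embedding is the average of token embeddings, with padding
# positions excluded via the attention mask.
def mean_pool(last_hidden_state: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # last_hidden_state: (batch, seq_len, hidden); attention_mask: (batch, seq_len)
    mask = attention_mask.unsqueeze(-1).type_as(last_hidden_state)  # (batch, seq_len, 1)
    summed = (last_hidden_state * mask).sum(dim=1)                  # sum over real tokens only
    counts = mask.sum(dim=1).clamp(min=1e-9)                        # number of real tokens
    return summed / counts

# Toy tensors standing in for model output: batch of 2, seq_len 4, hidden size 768.
hidden = torch.randn(2, 4, 768)
mask = torch.tensor([[1, 1, 1, 0], [1, 1, 0, 0]])  # second sequence has 2 real tokens
emb = mean_pool(hidden, mask)
print(emb.shape)  # torch.Size([2, 768])
```

In practice you would pass `outputs.last_hidden_state` from the model and `inputs["attention_mask"]` from the tokenizer into the same function.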

[1]: Williams A., Nangia N. & Bowman S. (2017) A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference. arXiv preprint arXiv:1704.05426

[2]: Conneau A. et al. (2018) XNLI: Evaluating Cross-lingual Sentence Representations. arXiv preprint arXiv:1809.05053

[3]: Reimers N., Gurevych I. (2019) Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks. arXiv preprint arXiv:1908.10084