ruSciBERT

The model was trained by the Sber AI team and the MLSA Lab of the Institute for AI, MSU. If you use this model in your project, please let us know (nikgerasimenko@gmail.com).

Presentation at the AI Journey 2022

  • Task: mask filling (see the usage sketch below)
  • Type: encoder
  • Tokenizer: BPE
  • Dictionary size: 50,265
  • Number of parameters: 123M
  • Training data volume: 6.5 GB
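Since the model is an encoder trained for mask filling, it can be queried with the standard fill-mask pipeline from Hugging Face transformers. The sketch below assumes the Hub model ID "ai-forever/ruSciBERT"; check the model page for the exact identifier and mask token.

```python
# Minimal mask-filling sketch (assumes the Hub ID "ai-forever/ruSciBERT").
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="ai-forever/ruSciBERT")

# Use the tokenizer's own mask token rather than hard-coding it,
# since BPE-based encoders may use <mask> instead of [MASK].
mask = fill_mask.tokenizer.mask_token

# Example prompt: "The model was trained on a corpus of scientific <mask>."
for prediction in fill_mask(f"Модель обучена на корпусе научных {mask}."):
    print(prediction["token_str"], round(prediction["score"], 3))
```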