ruBert-large

The model was trained by the SberDevices team.

  • Task: mask filling
  • Type: encoder
  • Tokenizer: BPE
  • Dict size: 120,138
  • Num parameters: 427 M
  • Training data volume: 30 GB
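Since the model is an encoder trained for mask filling, it can be queried through the Hugging Face `transformers` fill-mask pipeline. A minimal sketch follows; the repository id `ai-forever/ruBert-large` and the example sentence are assumptions, so check the model's hosting page for the exact name.

```python
# Minimal mask-filling sketch. The model id below is an assumption --
# verify the exact repository name on the model's page.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="ai-forever/ruBert-large")

# BERT-style tokenizers mark the slot to predict with the [MASK] token.
results = fill_mask("Москва - [MASK] России.")  # "Moscow is the [MASK] of Russia."
for r in results:
    print(r["token_str"], round(r["score"], 3))
```

Each entry in `results` contains the predicted token string, its score, and the filled-in sequence, sorted by descending probability.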