ruRoberta-large

The model was trained by the SberDevices team.

  • Task: mask filling
  • Type: encoder
  • Tokenizer: BBPE
  • Dict size: 50 257
  • Num Parameters: 355 M
  • Training Data Volume: 250 GB
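Since this is a RoBERTa-style masked language model, it can be queried with the standard transformers fill-mask pipeline. A minimal sketch follows; the hub ID `ai-forever/ruRoberta-large` is an assumption here, so substitute the actual checkpoint name if it differs.

```python
from transformers import pipeline

# Assumed hub ID; adjust if the checkpoint is published under another name.
fill_mask = pipeline("fill-mask", model="ai-forever/ruRoberta-large")

# RoBERTa-style tokenizers use "<mask>" as the mask token.
for prediction in fill_mask("Столица России - <mask>."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

Each prediction is a dict with the filled-in token (`token_str`) and its probability (`score`), sorted from most to least likely.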