ruT5-base

The model was trained by the SberDevices team.

  • Task: text2text generation
  • Type: encoder-decoder
  • Tokenizer: BPE
  • Dict size: 32 101
  • Num Parameters: 222 M
  • Training Data Volume: 300 GB
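Since this is an encoder-decoder text2text model, it can be loaded with the Hugging Face `transformers` library. A minimal sketch is below; the hub id `ai-forever/ruT5-base` and the example input are assumptions, so check the model page for the exact identifier.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Assumed hub id for this model; verify against the model page.
MODEL_ID = "ai-forever/ruT5-base"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = T5ForConditionalGeneration.from_pretrained(MODEL_ID)

# Any Russian text2text prompt works here; this input is illustrative.
text = "Перевести на английский: привет"
inputs = tokenizer(text, return_tensors="pt")

# Greedy generation kept short for a quick smoke test.
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Note that the base checkpoint is a pretrained model, not an instruction-tuned one, so it is usually fine-tuned on a downstream text2text task before use.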