ruT5-large

The model was trained by the SberDevices team.

  • Task: text2text generation
  • Type: encoder-decoder
  • Tokenizer: BPE
  • Dict size: 32 101
  • Num Parameters: 737 M
  • Training Data Volume: 300 GB
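
A minimal loading sketch with the Transformers library is shown below. The Hub identifier ai-forever/ruT5-large and the example input are assumptions and may need adjusting for your setup; the base model is pretrained for text2text generation and typically requires fine-tuning for a downstream task.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Assumed Hub identifier; adjust if your copy of the model lives elsewhere.
model_name = "ai-forever/ruT5-large"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Illustrative input only: encode a Russian sentence and generate an output string.
input_text = "Кот сидит на ковре."
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```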