ruT5-large
The model was trained by the SberDevices team.
- Task: text2text generation
- Type: encoder-decoder
- Tokenizer: BPE
- Dict size: 32 101
- Num Parameters: 737 M
- Training Data Volume: 300 GB
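
A minimal usage sketch with the Hugging Face Transformers library is shown below. The Hub model id `sberbank-ai/ruT5-large` and the example input are assumptions, not taken from this card, and as a pretrained checkpoint the model is typically fine-tuned on a downstream text2text task before its generations become useful.

```python
# Minimal ruT5-large loading/generation sketch (assumed Hub id: "sberbank-ai/ruT5-large").
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "sberbank-ai/ruT5-large"  # assumption: check the Hub for the exact model id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)
model.eval()

# Encode a Russian source text and run encoder-decoder generation.
# Note: this is a raw pretrained checkpoint, so outputs are only meaningful
# after fine-tuning on a concrete text2text task (summarization, translation, etc.).
text = "Кошка сидит на окне."  # example input, chosen for illustration only
input_ids = tokenizer(text, return_tensors="pt").input_ids
with torch.no_grad():
    output_ids = model.generate(input_ids, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```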