ruT5-base
The model was trained by the SberDevices team.
- Task: text2text generation
- Type: encoder-decoder
- Tokenizer: BPE
- Dictionary size: 32 101
- Number of parameters: 222 M
- Training data volume: 300 GB
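
Below is a minimal usage sketch with the Hugging Face Transformers library. The repository id and the example input are assumptions for illustration and are not part of this card; the raw pretrained checkpoint is normally fine-tuned on a downstream text2text task before use.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed repository id; check the model page for the exact name.
MODEL_NAME = "ai-forever/ruT5-base"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

# T5-style span in-filling: the sentinel token marks the gap to fill.
# Example input (Russian): "Moscow is the <extra_id_0> of Russia."
text = "Москва - <extra_id_0> России."
input_ids = tokenizer(text, return_tensors="pt").input_ids

with torch.no_grad():
    output_ids = model.generate(input_ids, max_new_tokens=10)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```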