---
language:
- ru
tags:
- PyTorch
- Transformers
---
# rut5-base-detox-v2
The model was fine-tuned from `sberbank-ai/ruT5-base` on a parallel Russian detoxification corpus.
* Task: `text2text generation`
* Type: `encoder-decoder`
* Tokenizer: `bpe`
* Dict size: `32 101`
* Num Parameters: `222 M`
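Since this is a `text2text generation` model, it can be driven through the standard `transformers` seq2seq API. The sketch below assumes the checkpoint is published on the Hugging Face Hub; `MODEL_ID` is a hypothetical placeholder, not the card's actual repository name, so substitute the real one before use.

```python
# Minimal inference sketch for a T5-style detoxification model.
# MODEL_ID below is a hypothetical placeholder -- replace it with
# the actual Hub repository of rut5-base-detox-v2.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_ID = "your-org/rut5-base-detox-v2"  # hypothetical hub id

def detoxify(text: str) -> str:
    """Rewrite a toxic Russian sentence into a neutral paraphrase."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(text, return_tensors="pt")
    # Beam search tends to keep the paraphrase close to the input
    # while removing the toxic wording.
    output_ids = model.generate(**inputs, num_beams=5, max_length=128)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(detoxify("Заткнись и вали отсюда!"))
```

The heavy model download only happens inside `detoxify`, so importing the module is cheap; for batch use, hoist the `from_pretrained` calls out of the function and reuse the loaded objects.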