---
language:
- ru
tags:
- PyTorch
- Transformers
---
# rut5-base-detox-v2
The model was fine-tuned from sberbank-ai/ruT5-base on a parallel detoxification corpus.
* Task: `text2text generation`
* Type: `encoder-decoder`
* Tokenizer: `bpe`
* Vocabulary size: `32 101`
* Number of parameters: `222 M`
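
A minimal usage sketch with the Transformers library is shown below. The repository id is an assumption based on the model name; replace it with the actual Hugging Face Hub path of this model.

```python
# Minimal sketch: load the detoxification model and rewrite one sentence.
# NOTE: "rut5-base-detox-v2" is an assumed repo id, not a confirmed Hub path.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "rut5-base-detox-v2"  # replace with the real <namespace>/<repo> id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

text = "..."  # a toxic Russian sentence to be rewritten in a neutral style
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=128, num_beams=5)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```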