---
language:
- en
tags:
- detoxification
licenses:
- cc-by-nc-sa
pipeline_tag: text2text-generation
---

**Model Overview**

This is a TT-compressed version of the original BART-based detoxification model
[s-nlp/bart-base-detox][1].
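
The effect of the compression can be checked by comparing the parameter counts of the two checkpoints. This is a minimal sketch using the standard `num_parameters()` helper from `transformers`; the exact reduction depends on the TT ranks used and is not stated here:

```python
from transformers import AutoModelForSeq2SeqLM

# Compressed checkpoint: the custom modelling code in the repo
# requires trust_remote_code=True.
compressed = AutoModelForSeq2SeqLM.from_pretrained(
    's-nlp/bart-base-detox-ttd', trust_remote_code=True)

# Original, uncompressed detoxification model.
original = AutoModelForSeq2SeqLM.from_pretrained('s-nlp/bart-base-detox')

print(f'original:   {original.num_parameters():,} parameters')
print(f'compressed: {compressed.num_parameters():,} parameters')
```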

**How to use**

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# The repository ships custom modelling code, so trust_remote_code=True is required.
model = AutoModelForSeq2SeqLM.from_pretrained(
    's-nlp/bart-base-detox-ttd', trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained('facebook/bart-base')

toxics = ['that sick fuck is going to be out in 54 years.']
inputs = tokenizer(toxics, return_tensors='pt')  # PyTorch tensors for generate()
outputs = model.generate(**inputs, num_return_sequences=1, do_sample=False,
                         repetition_penalty=10.0,
                         max_length=128, num_beams=5)
neutrals = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(neutrals)  # stdout: She is going to be out in 54 years.
```

[1]: https://huggingface.co/s-nlp/bart-base-detox