---
language:
- en
tags:
- detoxification
licenses:
- cc-by-nc-sa
pipeline_tag: text2text-generation
---

**Model Overview**

This is a TT (Tensor-Train) compressed version of the original BART-based detoxification model [s-nlp/bart-base-detox][1].

**How to use**

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model = AutoModelForSeq2SeqLM.from_pretrained(
    's-nlp/bart-base-detox-ttd',
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained('facebook/bart-base')

toxics = ['that sick fuck is going to be out in 54 years.']

# Tokenize as PyTorch tensors so the encodings can be passed to generate()
tokens = tokenizer(toxics, return_tensors='pt')
tokens = model.generate(**tokens,
                        num_return_sequences=1,
                        do_sample=False,
                        temperature=1.0,
                        repetition_penalty=10.0,
                        max_length=128,
                        num_beams=5)
neutrals = tokenizer.decode(tokens[0, ...], skip_special_tokens=True)

print(neutrals)
# stdout: She is going to be out in 54 years.
```

[1]: https://huggingface.co/s-nlp/bart-base-detox
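For detoxifying several sentences at once, the same model and tokenizer can be used with padded batches. The sketch below is illustrative (the second input sentence and the variable names are not part of the original card); it reuses `model` and `tokenizer` from the snippet above.

```python
# A minimal batched-usage sketch, assuming `model` and `tokenizer` are
# already loaded as shown above. The inputs are illustrative examples.
toxics = [
    'that sick fuck is going to be out in 54 years.',
    'what a dumb idea, who came up with this garbage?',
]

# Pad the batch so all sequences share the same length
tokens = tokenizer(toxics, return_tensors='pt', padding=True)
outputs = model.generate(**tokens,
                         do_sample=False,
                         max_length=128,
                         num_beams=5)
neutrals = tokenizer.batch_decode(outputs, skip_special_tokens=True)

for src, tgt in zip(toxics, neutrals):
    print(f'{src} -> {tgt}')
```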