---
library_name: transformers
tags:
- detoxification
- style_transfer
license: mit
datasets:
- textdetox/multilingual_paradetox
language:
- en
- ar
- am
- zh
- uk
- hi
- es
- ru
- de
metrics:
- chrf
pipeline_tag: text2text-generation
---
|
|
|
# mT5-XL Detoxification Baseline
|
|
|
This is a baseline detoxification model trained on the released parallel corpus (dev part) of toxic texts, [MultiParaDetox](https://huggingface.co/datasets/textdetox/multilingual_paradetox).
|
|
|
|
|
## Model Details
|
|
|
The base model for this fine-tune is [mT5-xl](https://huggingface.co/google/mt5-xl).
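Since the card declares `pipeline_tag: text2text-generation`, the model can be used like any seq2seq checkpoint in `transformers`. Below is a minimal sketch; the repository id `textdetox/mt5-xl-detox-baseline` is an assumption (substitute this model's actual Hub id), and the helper `detoxify` is an illustrative name, not part of the release.

```python
def detoxify(text, model, tokenizer, max_new_tokens=64):
    """Rewrite a toxic sentence into a neutral one with a seq2seq model."""
    # Encode the toxic input, let the model generate a detoxified rewrite.
    inputs = tokenizer(text, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    # Hypothetical Hub id -- replace with this model's actual repository id.
    model_id = "textdetox/mt5-xl-detox-baseline"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
    print(detoxify("your toxic sentence here", model, tokenizer))
```

Note that mT5-XL has roughly 3.7B parameters, so loading it requires a correspondingly large amount of memory.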
|
|
|
|
|
|
## Citation
|
|
|
The model was developed as a baseline for the [TextDetox CLEF-2024](https://pan.webis.de/clef24/pan24-web/text-detoxification.html) shared task.