Binary toxicity classifier for Ukrainian

This is an "xlm-roberta-base" model fine-tuned on a semi-automatically collected Ukrainian toxicity classification dataset.
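
For orientation, fine-tuning xlm-roberta-base for binary classification with the Transformers Trainer generally follows the pattern sketched below; the toy data, hyperparameters, and column names here are assumptions for illustration, not the actual training setup used for this model.

```python
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Toy data standing in for the real Ukrainian toxicity dataset (assumption).
raw = Dataset.from_dict({"text": ["приклад тексту", "інший приклад"],
                         "label": [0, 1]})

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-base", num_labels=2)  # 2 labels: toxic / non-toxic

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=64)

train_ds = raw.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=train_ds,
)
trainer.train()
```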

The evaluation metrics for binary toxicity classification on a test set are:

| Metric    | Value |
|-----------|-------|
| F1-score  | 0.99  |
| Precision | 0.99  |
| Recall    | 0.99  |
| Accuracy  | 0.99  |
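
For reference, binary classification metrics like these are typically computed with scikit-learn along the following lines; the label arrays below are placeholders, not the actual test-set predictions.

```python
from sklearn.metrics import (accuracy_score, f1_score,
                             precision_score, recall_score)

# y_true: gold binary labels from a held-out test split (placeholder values).
# y_pred: labels predicted by the classifier on the same split.
y_true = [0, 1, 1, 0]
y_pred = [0, 1, 1, 0]

print("F1-score :", f1_score(y_true, y_pred))
print("Precision:", precision_score(y_true, y_pred))
print("Recall   :", recall_score(y_true, y_pred))
print("Accuracy :", accuracy_score(y_true, y_pred))
```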

How to use:

```python
from transformers import pipeline

# Load the Ukrainian toxicity classifier from the Hugging Face Hub.
classifier = pipeline("text-classification",
                      model="ukr-detect/ukr-toxicity-classifier")
```
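
Calling the pipeline returns a predicted label and a confidence score for each input. A minimal usage sketch follows; the input sentence is an arbitrary example, and the exact label names depend on the model's configuration.

```python
# Classify a single Ukrainian sentence (example input, not from the model card).
result = classifier("Гарного вам дня!")  # "Have a nice day!"
print(result)
# -> [{'label': ..., 'score': ...}]  # label names come from model.config.id2label
```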