---
license: apache-2.0
datasets:
- jigsaw_unintended_bias
- jigsaw_toxicity_pred
language:
- en
metrics:
- accuracy
pipeline_tag: text-classification
---
# Lite Toxic Comment Classification
A lightweight ALBERT-based model for English toxic comment classification. It achieves a mean AUC of 98.28 on the Jigsaw test set.
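
## Usage

A minimal usage sketch with the 🤗 Transformers `pipeline` API. The repository id below is a placeholder; replace it with this model's actual Hub path, and note that the label names returned depend on the model's configuration.

```python
from transformers import pipeline

# Placeholder repository id -- substitute the actual Hub path of this model.
classifier = pipeline(
    "text-classification",
    model="your-username/lite-toxic-comment-classification",
)

# Classify a comment; returns a list of {'label': ..., 'score': ...} dicts.
result = classifier("You are a wonderful person.")
print(result)
```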