
LionGuard

LionGuard is a classifier for detecting unsafe content in the Singapore context. It uses pre-trained BAAI English embeddings and performs classification with a trained ridge classification model. This classifier detects the presence of sexual content, defined as content meant to arouse sexual excitement, such as descriptions of sexual activity, or content that promotes sexual services (excluding sex education and wellness). This includes sexual content that involves an individual who is under 18 years old.
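
For orientation, here is a minimal sketch of that two-stage design: embed the text with a BAAI encoder, then classify the embedding with a ridge classifier. The embedding model name (BAAI/bge-large-en-v1.5) is an assumption, and scikit-learn's RidgeClassifier fitted on toy labels stands in for the released classifier weights; this is not the model's actual code.

```python
# Sketch only: BAAI embedding + ridge classifier, with assumed model name and toy labels.
import torch
from transformers import AutoModel, AutoTokenizer
from sklearn.linear_model import RidgeClassifier

EMBED_MODEL = "BAAI/bge-large-en-v1.5"  # assumed embedding backbone

tokenizer = AutoTokenizer.from_pretrained(EMBED_MODEL)
model = AutoModel.from_pretrained(EMBED_MODEL)

def embed(texts):
    # Tokenize, run the encoder, and use the normalised [CLS] vector as the text embedding.
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        cls = model(**batch).last_hidden_state[:, 0]
    return torch.nn.functional.normalize(cls, dim=-1).numpy()

# Toy stand-in for the trained classifier: fit a ridge classifier on two example labels.
X = embed(["an innocuous sentence", "an unsafe sentence"])
clf = RidgeClassifier().fit(X, [0, 1])

print(clf.predict(embed(["Example text 1"])))
```

In the released model the ridge classifier is already trained, so only the embedding and prediction steps are needed at inference time.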

Usage

  1. Install the transformers, onnxruntime, and huggingface_hub libraries.
pip install transformers onnxruntime huggingface_hub
  2. Run inference by passing a JSON list of texts to inference.py (a sketch of such a script is shown below).
python inference.py '["Example text 1"]'
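
The repository's inference.py is not reproduced here; the following is a hypothetical sketch of what a script with the command-line interface above could look like. The repository id (govtech/lionguard), the ONNX file name (classifier.onnx), and the embedding backbone are assumptions for illustration, not the model's confirmed API.

```python
# Hypothetical inference.py sketch: read a JSON list of texts from argv,
# embed them, and score them with a downloaded ONNX ridge classifier.
import json
import sys

import numpy as np
import onnxruntime as ort
import torch
from huggingface_hub import hf_hub_download
from transformers import AutoModel, AutoTokenizer

EMBED_MODEL = "BAAI/bge-large-en-v1.5"    # assumed embedding backbone
CLASSIFIER_REPO = "govtech/lionguard"     # assumed Hub repository id
CLASSIFIER_FILE = "classifier.onnx"       # assumed ONNX file name

def embed(texts):
    # Normalised [CLS] embeddings from the BAAI encoder, as float32 for ONNX Runtime.
    tokenizer = AutoTokenizer.from_pretrained(EMBED_MODEL)
    model = AutoModel.from_pretrained(EMBED_MODEL)
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        cls = model(**batch).last_hidden_state[:, 0]
    return torch.nn.functional.normalize(cls, dim=-1).numpy().astype(np.float32)

def main():
    texts = json.loads(sys.argv[1])       # e.g. '["Example text 1"]'
    onnx_path = hf_hub_download(CLASSIFIER_REPO, CLASSIFIER_FILE)
    session = ort.InferenceSession(onnx_path)
    input_name = session.get_inputs()[0].name
    scores = session.run(None, {input_name: embed(texts)})[0]
    print(json.dumps(scores.tolist()))

if __name__ == "__main__":
    main()
```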