---

title: Content Classifier
emoji: 🔍
colorFrom: blue
colorTo: purple
sdk: docker
pinned: false
license: mit
app_port: 7860
---


# Content Classifier API

A FastAPI-based content classification service using an ONNX model for threat detection and sentiment analysis.

## Features

- Content threat classification
- Sentiment analysis
- RESTful API with automatic documentation
- Health check endpoints
- Model information endpoints
- Docker support for easy deployment

## API Endpoints

- `POST /predict` - Classify text content
- `GET /` - API status
- `GET /health` - Health check
- `GET /model-info` - Model information
- `GET /docs` - Interactive API documentation (Swagger)
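
For orientation, here is a minimal sketch of how these routes could be declared with FastAPI. The request schema and handler bodies below are illustrative assumptions, not the actual contents of `app.py`:

```python
# Illustrative sketch only; the real app.py wires these routes to the ONNX model.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Content Classifier API")

class PredictRequest(BaseModel):  # hypothetical request schema
    text: str

@app.get("/")
def status():
    return {"status": "ok"}

@app.get("/health")
def health():
    return {"status": "healthy"}

@app.get("/model-info")
def model_info():
    return {"model": "contextClassifier.onnx"}

@app.post("/predict")
def predict(request: PredictRequest):
    # Placeholder: the real handler runs the ONNX model plus sentiment analysis.
    return {"is_threat": False, "final_confidence": 0.0}
```

`GET /docs` is generated automatically by FastAPI, so it needs no explicit route.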

## Installation

1. Install dependencies:
```bash
pip install -r requirements.txt
```

2. Run the application:
```bash
python app.py
```

The API will be available at `http://localhost:8000`.

## Usage

### Example Request

```bash
curl -X POST "http://localhost:8000/predict" \
     -H "Content-Type: application/json" \
     -d '{"text": "This is a sample text to classify"}'
```

### Example Response

```json
{
    "is_threat": false,
    "final_confidence": 0.75,
    "threat_prediction": 0.25,
    "sentiment_analysis": {
        "label": "POSITIVE",
        "score": 0.5
    },
    "onnx_prediction": {
        "threat_probability": 0.25,
        "raw_output": [[0.75, 0.25]]
    },
    "models_used": ["contextClassifier.onnx"],
    "raw_predictions": {
        "onnx": {
            "threat_probability": 0.25,
            "raw_output": [[0.75, 0.25]]
        },
        "sentiment": {
            "label": "POSITIVE",
            "score": 0.5
        }
    }
}
```
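
The same request can be made from Python with the `requests` library; the field names below match the example response above:

```python
import requests

# Send a text sample to the /predict endpoint and inspect the classification result.
response = requests.post(
    "http://localhost:8000/predict",
    json={"text": "This is a sample text to classify"},
)
response.raise_for_status()

result = response.json()
print("Threat detected:", result["is_threat"])
print("Confidence:", result["final_confidence"])
print("Sentiment:", result["sentiment_analysis"]["label"])
```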

## Docker Deployment

1. Build the Docker image:
```bash
docker build -t content-classifier .
```

2. Run the container:
```bash
docker run -p 8000:8000 content-classifier
```

## Hugging Face Spaces Deployment

To deploy on Hugging Face Spaces:

1. Create a new Space on Hugging Face
2. Upload all files to your Space repository
3. The Space will automatically build and deploy

## Model Requirements

The ONNX model should accept text inputs and return classification predictions. You may need to adjust the preprocessing and postprocessing functions in `app.py` based on your specific model requirements.
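
As a rough guide, the inference path usually follows the shape sketched below. The placeholder encoding, input name lookup, and output layout are assumptions to adapt, not the actual code in `app.py`:

```python
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("contextClassifier.onnx")

def preprocess(text: str, max_length: int = 128) -> dict:
    # Placeholder encoding: swap in whatever your model expects
    # (e.g. a tokenizer producing input_ids / attention_mask arrays).
    token_ids = [min(ord(ch), 255) for ch in text[:max_length]]
    token_ids += [0] * (max_length - len(token_ids))  # pad to max_length
    return {session.get_inputs()[0].name: np.array([token_ids], dtype=np.int64)}

def postprocess(raw_output: np.ndarray) -> dict:
    # Assumes a [not_threat, threat] probability pair, as in the example response;
    # adjust the indexing if your model's output layout differs.
    return {
        "threat_probability": float(raw_output[0][1]),
        "raw_output": raw_output.tolist(),
    }

raw = session.run(None, preprocess("sample text"))[0]
print(postprocess(raw))
```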

## Configuration

You can modify the following in `app.py`:
- `MODEL_PATH`: Path to your ONNX model file
- `max_length`: Maximum text length for processing
- Preprocessing and postprocessing logic based on your model's requirements (see the sketch below)
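
For example, the tunable values near the top of `app.py` might look like this; apart from `MODEL_PATH` and `max_length`, the defaults shown are illustrative:

```python
# Configuration constants (exact defaults may differ in the real app.py).
MODEL_PATH = "contextClassifier.onnx"  # path to the ONNX model file
max_length = 128                       # maximum text length used during preprocessing (assumed)
```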