The original unquantized model can be found [here](https://huggingface.co/minuva

The model contains two labels only (toxicity and severe toxicity). For the model with all labels, refer to this [page](https://huggingface.co/minuva/MiniLMv2-toxic-jigsaw).

# Optimum

## Installation

Install from source:

```bash
python -m pip install optimum[onnxruntime]@git+https://github.com/huggingface/optimum.git
```

## Run the Model

```py
from optimum.onnxruntime import ORTModelForSequenceClassification
from transformers import AutoTokenizer, pipeline

# Load the quantized ONNX model on CPU
model = ORTModelForSequenceClassification.from_pretrained(
    'minuva/MiniLMv2-toxic-jigsaw-lite-onnx',
    provider="CPUExecutionProvider",
)
tokenizer = AutoTokenizer.from_pretrained(
    'minuva/MiniLMv2-toxic-jigsaw-lite-onnx',
    use_fast=True,
    model_max_length=256,
    truncation=True,
    padding='max_length',
)

pipe = pipeline(task='text-classification', model=model, tokenizer=tokenizer)
texts = ["This is pure trash"]
pipe(texts)
# [{'label': 'toxic', 'score': 0.6553249955177307}]
```

# ONNX Runtime only

A lighter option for deployment.

## Installation
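Once the runtime is installed, inference without `optimum` can be sketched as below. This is a minimal sketch, not part of this repo: the `classify` helper, the label order, and the assumption that the tokenizer's feed names match the exported graph's inputs are all hypothetical and should be checked against the model files.

```python
import numpy as np


def sigmoid(logits: np.ndarray) -> np.ndarray:
    """Map raw logits to independent per-label probabilities."""
    return 1.0 / (1.0 + np.exp(-logits))


def classify(session, tokenizer, texts, labels=("toxic", "severe_toxic")):
    """Run texts through a bare onnxruntime InferenceSession.

    `session` is an onnxruntime.InferenceSession over the repo's ONNX file,
    `tokenizer` a tokenizer configured as in the pipeline example above.
    The label order here is an assumption; verify it against the model config.
    """
    enc = tokenizer(
        list(texts),
        truncation=True,
        max_length=256,
        padding="max_length",
        return_tensors="np",
    )
    # Feed names must match the exported graph's input names (assumption)
    (logits,) = session.run(None, dict(enc))
    return [dict(zip(labels, p.tolist())) for p in sigmoid(logits)]
```

The sigmoid (rather than softmax) reflects that the two labels are scored independently, as is usual for multi-label toxicity heads; this too is an assumption about the export.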