distilbert-base-uncased-finetuned-sst-2-english

This is distilbert-base-uncased-finetuned-sst-2-english, quantized to INT8 with NNCF post-training quantization (PTQ) and exported to OpenVINO IR.

Model Description: This model reaches an accuracy of 90.0% on the SST-2 validation set. See ov_config.json in this repository for the quantization config.
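For reference, an INT8 model like this one can be produced with the OVQuantizer from optimum-intel, which applies NNCF post-training quantization using a small calibration set and saves the result as OpenVINO IR. The sketch below is illustrative only: the calibration-set size, preprocessing, and save directory are assumptions, not the exact settings used for this checkpoint.

from functools import partial
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from optimum.intel.openvino import OVQuantizer

base_model_id = "distilbert-base-uncased-finetuned-sst-2-english"
model = AutoModelForSequenceClassification.from_pretrained(base_model_id)
tokenizer = AutoTokenizer.from_pretrained(base_model_id)

def preprocess_fn(examples, tokenizer):
    # Tokenize the SST-2 "sentence" column for calibration.
    return tokenizer(examples["sentence"], padding=True, truncation=True)

quantizer = OVQuantizer.from_pretrained(model)
# Build a small calibration set from the SST-2 training split (sample count is an assumption).
calibration_dataset = quantizer.get_calibration_dataset(
    "glue",
    dataset_config_name="sst2",
    preprocess_function=partial(preprocess_fn, tokenizer=tokenizer),
    num_samples=300,
    dataset_split="train",
)
# Apply NNCF post-training quantization and save the quantized OpenVINO model.
quantizer.quantize(calibration_dataset=calibration_dataset, save_directory="distilbert-sst2-ov-int8")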

Usage example

To install the requirements for using the OpenVINO backend, run:

pip install optimum[openvino]

This installs all necessary dependencies, including Transformers and OpenVINO.

NOTE: Python 3.7-3.9 are supported. A virtualenv is recommended.

You can use this model with a Transformers pipeline.

from transformers import AutoTokenizer, pipeline
from optimum.intel.openvino import OVModelForSequenceClassification

model_id = "helenai/distilbert-base-uncased-finetuned-sst-2-english-ov-int8"
# Load the quantized OpenVINO model and its tokenizer from the Hub.
model = OVModelForSequenceClassification.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)
# The OpenVINO model is a drop-in replacement for a Transformers model in a pipeline.
cls_pipe = pipeline("text-classification", model=model, tokenizer=tokenizer)
text = "OpenVINO is awesome!"
outputs = cls_pipe(text)
print(outputs)

Example output:

[{'label': 'POSITIVE', 'score': 0.9998594522476196}]
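The pipeline wrapper is optional; you can also call the OpenVINO model directly and post-process the logits yourself. The following is a minimal sketch assuming PyTorch tensors as input and a softmax over the logits; it is one possible post-processing, not the only one.

import torch
from transformers import AutoTokenizer
from optimum.intel.openvino import OVModelForSequenceClassification

model_id = "helenai/distilbert-base-uncased-finetuned-sst-2-english-ov-int8"
model = OVModelForSequenceClassification.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

inputs = tokenizer("OpenVINO is awesome!", return_tensors="pt")
# Inference runs on the OpenVINO runtime; the output follows the Transformers convention.
logits = model(**inputs).logits
probabilities = torch.softmax(logits, dim=-1)[0]
predicted_id = int(probabilities.argmax())
print(model.config.id2label[predicted_id], float(probabilities[predicted_id]))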