---
license: apache-2.0
datasets:
- sst2
- glue
---
This model is a fork of https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english, quantized using dynamic Post-Training Quantization (PTQ) with ONNX Runtime and the 🤗 Optimum library.
It achieves an accuracy of 0.901 on the SST-2 validation set.
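For reference, the snippet below is a minimal sketch of the kind of dynamic PTQ workflow described above, using 🤗 Optimum with ONNX Runtime. The exact quantization configuration used to produce this checkpoint is not documented here, so the `avx512_vnni` config, the save directory, and the `export=True` argument are assumptions (argument names may differ slightly across Optimum versions).

```python
# Sketch only: dynamic PTQ of the original checkpoint with 🤗 Optimum + ONNX Runtime.
# The actual configuration used for this repository may differ.
from optimum.onnxruntime import ORTModelForSequenceClassification, ORTQuantizer
from optimum.onnxruntime.configuration import AutoQuantizationConfig

# Export the original PyTorch checkpoint to ONNX.
onnx_model = ORTModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased-finetuned-sst-2-english", export=True
)

# Dynamic quantization needs no calibration dataset (is_static=False).
quantizer = ORTQuantizer.from_pretrained(onnx_model)
dqconfig = AutoQuantizationConfig.avx512_vnni(is_static=False, per_channel=False)

# Quantize the weights to int8 and save the resulting model.
quantizer.quantize(
    save_dir="distilbert-sst2-int8-dynamic",  # hypothetical output directory
    quantization_config=dqconfig,
)
```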
To load this model:
```python
from optimum.onnxruntime import ORTModelForSequenceClassification

model = ORTModelForSequenceClassification.from_pretrained(
    "fxmarty/distilbert-base-uncased-finetuned-sst-2-english-int8-dynamic"
)
```
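As a usage sketch (assuming the repository ships a tokenizer alongside the ONNX weights), the loaded model can be used with a standard 🤗 Transformers pipeline; the example sentence and printed output are illustrative only:

```python
from transformers import AutoTokenizer, pipeline
from optimum.onnxruntime import ORTModelForSequenceClassification

model_id = "fxmarty/distilbert-base-uncased-finetuned-sst-2-english-int8-dynamic"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = ORTModelForSequenceClassification.from_pretrained(model_id)

# ORTModel instances can be passed to transformers pipelines like regular models.
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("I love the new design of this app!"))
# e.g. [{'label': 'POSITIVE', 'score': ...}]
```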