---
language: en
license: mit
tags:
- distilled
- text-classification
datasets:
- custom
metrics:
- accuracy
- f1
- inference-time
---
# Distilled Model from BERT
This model is a knowledge-distilled version of a fine-tuned BERT model: a smaller DistilBERT student trained to reproduce the teacher's predictions.
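Since the student is a standard `DistilBertForSequenceClassification` checkpoint, it should load with the usual `transformers` pipeline. The repo id below is a placeholder, not the actual repository name:

```python
from transformers import pipeline

# Placeholder repo id; substitute the actual model repository
classifier = pipeline("text-classification", model="your-username/distilled-bert")
print(classifier("An example sentence to classify."))
```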
## Model description
- Distillation parameters: alpha=0.5, temperature=2.0 (see the loss sketch after this list)
- Student model base: DistilBertForSequenceClassification
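The card only lists the hyperparameters. A minimal sketch of the standard soft-target distillation objective they would plug into is shown below; the exact weighting convention used in training is an assumption:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      alpha=0.5, temperature=2.0):
    """Blend hard-label cross-entropy with soft-target KL divergence.

    Sketch only: the card states alpha=0.5 and temperature=2.0 but not
    which term alpha weights, so this convention is an assumption.
    """
    # Hard-label loss against the ground-truth classes
    ce = F.cross_entropy(student_logits, labels)
    # Soft-target loss: match the teacher's temperature-smoothed distribution
    kl = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2  # rescale so soft-target gradients match the CE term
    return alpha * ce + (1.0 - alpha) * kl
```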
## Performance Metrics
### Model Size
- Teacher model: 416.43 MB
- Student model: 254.19 MB
- Size reduction: 38.96%
- Size ratio: 1.64x
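For reference, a common way to compute size figures like these from a loaded PyTorch model is shown below. Whether the numbers above were measured this way or from checkpoint files on disk is not stated in the card:

```python
def model_size_mb(model):
    """Approximate in-memory size of a PyTorch model in megabytes.

    A sketch of one common measurement; the card does not document
    how its size figures were actually produced.
    """
    param_bytes = sum(p.nelement() * p.element_size() for p in model.parameters())
    buffer_bytes = sum(b.nelement() * b.element_size() for b in model.buffers())
    return (param_bytes + buffer_bytes) / 1024 ** 2
```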
### Test Set 1 Performance
- Teacher accuracy: 0.8459
- Student accuracy: 0.7945
- Accuracy difference: -0.0514
- Teacher F1: 0.8344
- Student F1: 0.7728
- Speed improvement: 1.99x
### Test Set 2 Performance
- Teacher accuracy: 0.6710
- Student accuracy: 0.6217
- Accuracy difference: -0.0493
- Teacher F1: 0.6571
- Student F1: 0.5998
- Speed improvement: 1.99x
### Detailed Metrics - Test Set 1
- Precision: 0.7771
- Recall: 0.7945
- MCC: 0.3811
- Inference time per sample: 3.52 ms
- Samples per second: 284.23
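The timing setup behind the latency and throughput numbers is not documented. A minimal benchmark loop of the kind that would produce per-sample figures like these might look as follows; hardware, batch size, and run count are assumptions:

```python
import time
import torch

@torch.no_grad()
def measure_latency(model, encoded_inputs, n_runs=100):
    """Time repeated forward passes on a single encoded example.

    Hypothetical benchmark loop: the card does not describe its actual
    timing setup (hardware, batch size, or number of runs).
    """
    model.eval()
    model(**encoded_inputs)  # warm-up pass to exclude one-time setup cost
    start = time.perf_counter()
    for _ in range(n_runs):
        model(**encoded_inputs)
    elapsed = time.perf_counter() - start
    ms_per_sample = elapsed / n_runs * 1000
    return ms_per_sample, 1000 / ms_per_sample  # (ms/sample, samples/sec)
```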