ONNX Conversion of distilbert-base-cased-distilled-squad
DistilBERT base cased distilled SQuAD
This model is a fine-tuned checkpoint of DistilBERT-base-cased, trained with (a second step of) knowledge distillation on SQuAD v1.1. It reaches an F1 score of 87.1 on the dev set (for comparison, the bert-base-cased version of BERT reaches an F1 score of 88.7).