BERT-of-Theseus is a compressed BERT obtained by progressively replacing the components of the original BERT with compact substitute modules.
We provide a 6-layer model pretrained on MNLI as a general-purpose checkpoint. It transfers to other sentence classification tasks and outperforms DistilBERT (which has the same 6-layer structure) on six GLUE tasks (dev set).
Please note: this checkpoint is intended for intermediate-task transfer learning, so it does not include the MNLI classification head. Fine-tune it on your target task before use (as you would DistilBERT).
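A minimal fine-tuning setup might look like the sketch below: load the checkpoint with `AutoModelForSequenceClassification`, which attaches a freshly initialized classification head on top of the headless encoder. The Hub model id `canwenxu/BERT-of-Theseus-MNLI` is an assumption here; verify it against the model page before use.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed Hub id for this checkpoint; check the model page to confirm.
model_id = "canwenxu/BERT-of-Theseus-MNLI"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# num_labels creates a new, randomly initialized classification head,
# since the checkpoint ships without one; it must be fine-tuned before use.
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

inputs = tokenizer("a sentence to classify", return_tensors="pt")
outputs = model(**inputs)  # logits from the untrained head, shape (1, num_labels)
```

From here, train `model` on the target task with your usual loop or the `Trainer` API, exactly as you would fine-tune DistilBERT.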