How to use this model directly from the 🤗/transformers library:

from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("canwenxu/BERT-of-Theseus-MNLI")
model = AutoModel.from_pretrained("canwenxu/BERT-of-Theseus-MNLI")

# BERT-of-Theseus

BERT-of-Theseus is a compressed BERT obtained by progressively replacing the modules of the original BERT with smaller substitute modules.

## Load Pretrained Model on MNLI

We provide a 6-layer model pretrained on MNLI as a general-purpose checkpoint that transfers to other sentence classification tasks, outperforming DistilBERT (which has the same 6-layer structure) on six GLUE tasks (dev set).

| Method | MNLI | MRPC | QNLI | QQP | RTE | SST-2 | STS-B |
|---|---|---|---|---|---|---|---|
| BERT-base | 83.5 | 89.5 | 91.2 | 89.8 | 71.1 | 91.5 | 88.9 |
| DistilBERT | 79.0 | 87.5 | 85.3 | 84.9 | 59.9 | 90.7 | 81.2 |
| BERT-of-Theseus | 82.1 | 87.5 | 88.8 | 88.8 | 70.1 | 91.8 | 87.8 |

Please note: this checkpoint is intended for intermediate-task transfer learning, so it does not include the classification head for MNLI. Fine-tune it on your downstream task before use (as you would DistilBERT).
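Since the checkpoint ships without a classification head, one way to fine-tune it is to load it with `AutoModelForSequenceClassification`, which attaches a freshly initialized head on top of the pretrained encoder. A minimal sketch (the two-class target task and `num_labels=2` are assumptions for illustration, not part of this model card):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("canwenxu/BERT-of-Theseus-MNLI")

# Loads the pretrained 6-layer encoder and adds a randomly initialized
# classification head sized for a hypothetical 2-class downstream task.
model = AutoModelForSequenceClassification.from_pretrained(
    "canwenxu/BERT-of-Theseus-MNLI",
    num_labels=2,  # assumption: set this to your task's number of classes
)

# The head is untrained at this point; train it on your downstream data,
# e.g. with the transformers Trainer API or a standard PyTorch loop.
```

Expect a warning about newly initialized weights when loading; that is the classification head the note above says you must fine-tune.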