## How to use this model directly from the 🤗/transformers library
```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("canwenxu/BERT-of-Theseus-MNLI")
model = AutoModel.from_pretrained("canwenxu/BERT-of-Theseus-MNLI")
```
BERT-of-Theseus is a compressed BERT obtained by progressively replacing the modules of the original BERT with smaller ones.
We provide a 6-layer model pretrained on MNLI as a general-purpose model, which can transfer to other sentence classification tasks and outperforms DistilBERT (which has the same 6-layer structure) on six GLUE tasks (dev set).
Please note: this checkpoint is intended for Intermediate-Task Transfer Learning, so it does not include the classification head for MNLI! Please fine-tune it before use (just as you would with DistilBERT).
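Fine-tuning means loading the encoder into a sequence-classification architecture, which attaches a fresh classifier layer on top. A minimal sketch of what that head looks like (using a tiny local `BertConfig` instead of the real checkpoint so the example runs without a download; with the actual weights you would call `BertForSequenceClassification.from_pretrained("canwenxu/BERT-of-Theseus-MNLI", num_labels=...)`):

```python
from transformers import BertConfig, BertForSequenceClassification

# Tiny 6-layer stand-in config; the real checkpoint uses BERT-base sizes.
config = BertConfig(
    num_hidden_layers=6,
    hidden_size=128,
    num_attention_heads=4,
    intermediate_size=256,
    num_labels=3,  # e.g. 3 classes for an MNLI-style task
)
model = BertForSequenceClassification(config)

# The randomly initialized classification head maps hidden_size -> num_labels;
# it is exactly this layer that is missing from the checkpoint and must be
# trained on your downstream task.
print(model.classifier.in_features, model.classifier.out_features)  # 128 3
```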