Task: text-classification

prajjwal1/bert-medium-mnli



Contributed by prajjwal1 (Prajjwal)

How to use this model directly from the πŸ€—/transformers library:

from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("prajjwal1/bert-medium-mnli")
model = AutoModelForSequenceClassification.from_pretrained("prajjwal1/bert-medium-mnli")
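Once loaded, the model takes a tokenized premise/hypothesis pair and returns raw logits over the three MNLI classes. A minimal sketch of turning those logits into a prediction is below; the logits shown are hypothetical, and the label order is an assumption — confirm it against `model.config.id2label` for this checkpoint.

```python
import math

# Assumed MNLI label order; verify via model.config.id2label.
LABELS = ["entailment", "neutral", "contradiction"]

def softmax(logits):
    """Convert raw classifier logits to probabilities (numerically stable)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict_label(logits):
    """Map a logit vector to (label, probabilities)."""
    probs = softmax(logits)
    return LABELS[probs.index(max(probs))], probs

# Hypothetical logits, e.g. from:
#   inputs = tokenizer(premise, hypothesis, return_tensors="pt")
#   logits = model(**inputs).logits[0].tolist()
label, probs = predict_label([3.1, -0.4, -1.2])
```

In practice you would obtain the logits from `model(**inputs).logits` and apply the same softmax-and-argmax step shown here.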

This model is a PyTorch pre-trained model obtained by converting a TensorFlow checkpoint from the official Google BERT repository. These compact BERT variants were introduced in the paper "Well-Read Students Learn Better: On the Importance of Pre-training Compact Models". This model was fine-tuned on MNLI.

Accuracy:
MNLI: 75.86%
MNLI-mm: 77.03%

The model was trained for 4 epochs.