Task: text-classification
API endpoint

$ curl -X POST \
    -H "Authorization: Bearer YOUR_ORG_OR_USER_API_TOKEN" \
    -H "Content-Type: application/json" \
    -d '"json encoded string"' \
    https://api-inference.huggingface.co/models/prajjwal1/bert-tiny-mnli
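The same request can be issued from Python. Below is a minimal sketch using the `requests` library; the token placeholder and the exact payload shape accepted by this endpoint are assumptions that should be checked against the Inference API documentation.

```python
import requests

# Endpoint from the curl example above; replace the placeholder with a
# real Hugging Face API token before calling.
API_URL = "https://api-inference.huggingface.co/models/prajjwal1/bert-tiny-mnli"
HEADERS = {"Authorization": "Bearer YOUR_ORG_OR_USER_API_TOKEN"}

def query(payload):
    """POST a JSON payload to the hosted model and return the parsed JSON response."""
    response = requests.post(API_URL, headers=HEADERS, json=payload)
    response.raise_for_status()
    return response.json()
```

With a valid token, `query({"inputs": "..."})` returns the model's classification scores as JSON.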

Monthly model downloads

prajjwal1/bert-tiny-mnli: 379 downloads in the last 30 days

pytorch

tf

Contributed by

prajjwal1 Prajjwal
15 models

How to use this model directly from the 🤗/transformers library:

			
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("prajjwal1/bert-tiny-mnli")
model = AutoModelForSequenceClassification.from_pretrained("prajjwal1/bert-tiny-mnli")
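Loading the model is only the first step; running an NLI prediction requires passing a premise/hypothesis pair through the tokenizer and model. The sketch below shows one way to do this; the example sentences are illustrative, and the mapping of output indices to entailment/neutral/contradiction labels is an assumption that should be verified against the model's config.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("prajjwal1/bert-tiny-mnli")
model = AutoModelForSequenceClassification.from_pretrained("prajjwal1/bert-tiny-mnli")
model.eval()

# NLI takes a premise/hypothesis pair; the tokenizer joins them with [SEP].
premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."
inputs = tokenizer(premise, hypothesis, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 3): one score per MNLI class

probs = torch.softmax(logits, dim=-1)
# Label names come from the checkpoint's config and may be generic
# (LABEL_0 / LABEL_1 / LABEL_2) rather than entailment/neutral/contradiction.
predicted = model.config.id2label[probs.argmax(dim=-1).item()]
```

The softmax turns the three raw logits into class probabilities, and `id2label` maps the highest-scoring index back to the checkpoint's label name.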

The following model is a PyTorch pre-trained model obtained by converting a TensorFlow checkpoint found in the official Google BERT repository. These BERT variants were introduced in the paper Well-Read Students Learn Better: On the Importance of Pre-training Compact Models. This model was trained on MNLI and achieves the following accuracy:

MNLI (matched): 60%
MNLI-mm (mismatched): 61.61%

The model was trained for 4 epochs.

@prajjwal_1