
Contributed by chiragjn (Chirag Jain)

How to use this model directly from the 🤗/transformers library:

from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("chiragjn/small_bert_uncased_L-6_H-128_A-2")
model = AutoModel.from_pretrained("chiragjn/small_bert_uncased_L-6_H-128_A-2")
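To turn the model's token-level output into a single sentence embedding, one common approach (not prescribed by this model card) is to mean-pool the hidden states over non-padding tokens. A minimal sketch, using a dummy array in place of `model(**inputs).last_hidden_state` so it runs without downloading the model; the hidden size of 128 matches the H-128 in the model name:

```python
import numpy as np

# Dummy stand-in for model(**inputs).last_hidden_state:
# shape (batch_size=1, seq_len=5, hidden_size=128).
last_hidden_state = np.random.rand(1, 5, 128)

# Attention mask marks real tokens (1) vs. padding (0).
attention_mask = np.array([[1, 1, 1, 1, 0]])

# Mean-pool only over the non-padding positions.
mask = attention_mask[..., None]                  # (1, 5, 1)
summed = (last_hidden_state * mask).sum(axis=1)   # (1, 128)
counts = mask.sum(axis=1)                         # (1, 1)
sentence_embedding = summed / counts              # (1, 128)

print(sentence_embedding.shape)
```

With the real model, you would feed `tokenizer(text, return_tensors="pt")` through `model` and pool its `last_hidden_state` the same way.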


This model was converted from the release of 24 smaller BERT models (English only, uncased, trained with WordPiece masking) referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models.

Conversion was performed automatically using transformers-cli convert, as explained here.
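For reference, the conversion command has this general shape; the checkpoint and output paths below are placeholders, not the ones actually used for this model:

```shell
transformers-cli convert --model_type bert \
  --tf_checkpoint /path/to/bert_model.ckpt \
  --config /path/to/bert_config.json \
  --pytorch_dump_output /path/to/pytorch_model.bin
```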

This model is also available on TensorFlow Hub.

A small test was performed to check that the converted model and the TF Hub version generate similar embeddings for the same inputs.
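Such a check typically amounts to comparing the two models' embeddings for the same input with cosine similarity. A self-contained sketch, with dummy vectors standing in for the converted model's and the TF Hub model's outputs:

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two 1-D embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Dummy stand-ins for the two models' embeddings of the same sentence;
# a faithful conversion should make these nearly identical.
rng = np.random.default_rng(0)
emb_converted = rng.standard_normal(128)
emb_tfhub = emb_converted + 1e-6 * rng.standard_normal(128)

similarity = cosine_similarity(emb_converted, emb_tfhub)
print(similarity)  # close to 1.0 for matching models
```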