bert-base-5lang-cased

This is a smaller version of bert-base-multilingual-cased that handles only 5 languages (en, fr, es, de and zh) instead of 104. The model is therefore 30% smaller than the original one (124M parameters instead of 178M) but produces exactly the same representations for the five languages listed above. Starting from bert-base-5lang-cased makes it easier to deploy your model on public cloud platforms while keeping similar results. For instance, Google Cloud Platform requires the model size on disk to be lower than 500 MB for serverless deployments (Cloud Functions / Cloud ML), which is not the case for the original bert-base-multilingual-cased.
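Whether a saved model fits such a limit can be checked directly on disk before deployment. A minimal sketch in plain Python (the 500 MB threshold is the GCP figure cited above; the helper names and directory layout are assumptions, not part of any GCP API):

```python
import os

# Serverless platforms cap the size of the deployable artifact;
# the GCP limit cited above is 500 MB on disk.
SIZE_LIMIT_MB = 500

def dir_size_mb(path):
    """Total size of all files under `path`, in megabytes."""
    total = 0
    for root, _, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total / (1024 * 1024)

def fits_serverless_limit(model_dir, limit_mb=SIZE_LIMIT_MB):
    """True if the saved model directory is under the size limit."""
    return dir_size_mb(model_dir) < limit_mb
```

After `model.save_pretrained("./bert-base-5lang-cased")` (hypothetical local path), `fits_serverless_limit("./bert-base-5lang-cased")` should return True given the 495 MB figure in the table below, and False for the original multilingual model.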

For more information about the models' size, memory footprint and loading time, please refer to the table below:

| Model | Num parameters | Size | Memory | Loading time |
| ----- | -------------- | ---- | ------ | ------------ |
| bert-base-multilingual-cased | 178 million | 714 MB | 1400 MB | 4.2 sec |
| bert-base-5lang-cased | 124 million | 495 MB | 950 MB | 3.6 sec |

These measurements have been computed on a Google Cloud n1-standard-1 machine (1 vCPU, 3.75 GB).
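The headline "30% smaller" figure can be recovered from the table with a quick arithmetic check:

```python
# Figures taken from the comparison table above.
params_multilingual = 178_000_000
params_5lang = 124_000_000
size_multilingual_mb = 714
size_5lang_mb = 495

# Relative reduction in parameter count: about 30%.
param_reduction = 1 - params_5lang / params_multilingual
print(f"{param_reduction:.0%}")  # 30%

# Disk size shrinks by roughly the same fraction.
size_reduction = 1 - size_5lang_mb / size_multilingual_mb
print(f"{size_reduction:.0%}")   # 31%
```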

How to use

from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("amine/bert-base-5lang-cased")
model = AutoModel.from_pretrained("amine/bert-base-5lang-cased")
# Or, to keep the masked-language-modeling head:
# from transformers import AutoModelForMaskedLM
# model = AutoModelForMaskedLM.from_pretrained("amine/bert-base-5lang-cased")
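`AutoModel` returns token-level hidden states; to obtain a single sentence vector, one common convention (not something this model card prescribes) is attention-mask-aware mean pooling. A minimal NumPy sketch of that pooling step, independent of the model itself:

```python
import numpy as np

def mean_pool(hidden_states, attention_mask):
    """Average token embeddings, ignoring padding positions.

    hidden_states: (seq_len, hidden_dim) array, e.g. the model's
        last hidden state for one sentence
    attention_mask: (seq_len,) array of 0/1, as produced by the tokenizer
    """
    mask = attention_mask[:, None].astype(hidden_states.dtype)
    summed = (hidden_states * mask).sum(axis=0)
    count = mask.sum()
    return summed / count
```

With the model loaded above, `hidden_states` for a sentence would come from something like `model(**tokenizer(text, return_tensors="pt")).last_hidden_state[0].detach().numpy()` (variable names here are illustrative).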

How to cite

@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
  author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
  booktitle={SustaiNLP / EMNLP},
  year={2020}
}

Please get in touch with any questions, feedback, or requests.