---
license: apache-2.0
language:
  - en
library_name: transformers
pipeline_tag: fill-mask
datasets:
  - wikipedia
---

# BERT base model (uncased)

Pretrained model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/abs/1810.04805) and first released in [this repository](https://github.com/google-research/bert). This model is uncased: it does not make a difference between english and English.
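Since the card declares `pipeline_tag: fill-mask`, here is a minimal usage sketch with the `transformers` pipeline API; the model id `bert-base-uncased` is assumed from the card title:

```python
from transformers import pipeline

# Load the fill-mask pipeline with this checkpoint
# ("bert-base-uncased" is assumed from the card title).
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# The model predicts the token hidden behind [MASK]
# and returns candidate completions with scores.
print(unmasker("Hello I'm a [MASK] model."))
```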

Disclaimer: The team releasing BERT did not write a model card for this model, so this model card has been written by the Hugging Face team.