BERTje: A Dutch BERT model

BERTje is a Dutch pre-trained BERT model developed at the University of Groningen.

⚠️ The new home of this model is the GroNLP organization.

BERTje now lives at: GroNLP/bert-base-dutch-cased

The model weights at wietsedv/bert-base-dutch-cased and GroNLP/bert-base-dutch-cased are identical, so code that uses (or used) wietsedv/bert-base-dutch-cased will keep working as before.
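Since both model IDs resolve to the same weights, either can be loaded with the Hugging Face transformers library. A minimal fill-mask sketch, using the new GroNLP/bert-base-dutch-cased ID (the example sentence is illustrative, not from the model card):

```python
# Minimal sketch: masked-token prediction with BERTje via transformers.
# The model is downloaded from the Hugging Face Hub on first use.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="GroNLP/bert-base-dutch-cased")

# BERTje uses [MASK] as its mask token; the sentence below is a made-up example.
predictions = fill_mask("Ik woon in [MASK].")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

Each prediction is a dict with the filled-in token (`token_str`) and its probability (`score`); the pipeline returns the top candidates sorted by score.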
