Danish BERT (version 2, uncased) by Certainly (previously known as BotXO).

All credit goes to Certainly (previously known as BotXO), who developed Danish BERT. For data and training details see their GitHub repository or this article. You can also visit their organization page on Hugging Face.

The model is available in both TensorFlow and PyTorch formats.

The original TensorFlow version can be downloaded using this link.

Here is an example of how to load Danish BERT in PyTorch using the 🤗Transformers library:

```python
from transformers import AutoTokenizer, AutoModelForPreTraining

# Load the tokenizer and model weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("Maltehb/danish-bert-botxo")
model = AutoModelForPreTraining.from_pretrained("Maltehb/danish-bert-botxo")
```
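Since this is a BERT model trained with a masked-language-modeling objective, it can also be used directly for fill-mask prediction. Below is a minimal sketch using the Transformers `pipeline` API; the Danish example sentence is an illustrative assumption, not from the original model card.

```python
from transformers import pipeline

# Run the model as a fill-mask pipeline; the mask token for this model is [MASK].
fill_mask = pipeline("fill-mask", model="Maltehb/danish-bert-botxo")

# Illustrative Danish sentence: "København er Danmarks [MASK]."
# ("Copenhagen is Denmark's [MASK].")
for prediction in fill_mask("København er Danmarks [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

Each prediction is a dict containing the proposed token (`token_str`), its probability (`score`), and the completed sentence (`sequence`).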