GottBERT-base

German language model (RoBERTa architecture) trained solely on the German portion of the OSCAR data set.

Paper: GottBERT: a pure German Language Model

Authors: Raphael Scheible, Fabian Thomczyk, Patric Tippmann, Victor Jaravine, Martin Boeker

Task: fill-mask
Mask token: <mask>
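A minimal fill-mask usage sketch with the Hugging Face Transformers library, assuming the model is published on the Hub under the ID "uklfr/gottbert-base" (the repo ID and the example sentence are assumptions, not part of this card); the card lists `<mask>` as the mask token.

```python
from transformers import pipeline

# Assumed Hub repo ID; adjust if the model is hosted under a different name.
fill_mask = pipeline("fill-mask", model="uklfr/gottbert-base")

# Use the RoBERTa-style <mask> token listed on the card.
for prediction in fill_mask("Die Hauptstadt von Deutschland ist <mask>."):
    print(prediction["token_str"], prediction["score"])
```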