Released in October 2020, this is a German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our paper, we outline the steps taken to train our model and show that it is the state-of-the-art German language model.
Architecture: ELECTRA large (discriminator)
Language: German
- GermEval18 Coarse: 80.70
- GermEval18 Fine: 55.16
- GermEval14: 88.95
- deepset/gbert-base
- deepset/gbert-large
- deepset/gelectra-base
- deepset/gelectra-large
- deepset/gelectra-base-generator
- deepset/gelectra-large-generator
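As a minimal sketch, the discriminator checkpoint can be loaded with the Hugging Face transformers library. Note this is an ELECTRA discriminator, so it scores each token as original vs. replaced rather than filling masks; the example sentence below is our own illustration, not from the paper:

```python
import torch
from transformers import AutoTokenizer, ElectraForPreTraining

model_name = "deepset/gelectra-large"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = ElectraForPreTraining.from_pretrained(model_name)
model.eval()

# German example sentence (illustrative): "The capital of Germany is Berlin."
sentence = "Die Hauptstadt von Deutschland ist Berlin."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    # One logit per token: positive values suggest the token was "replaced"
    logits = model(**inputs).logits

replaced = (logits > 0).int()
```

For downstream tasks such as GermEval-style classification, you would instead fine-tune the checkpoint via `ElectraForSequenceClassification` or a framework like FARM/Haystack.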
branden.chan [at] deepset.ai
stefan [at] schweter.eu
timo.moeller [at] deepset.ai
We bring NLP to the industry via open source!
Our focus: industry-specific language models & large-scale QA systems.
Some of our work:
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
By the way: we're hiring!