
German ELECTRA base

Released in October 2020, this is a German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our paper, we outline the steps taken to train our model. Our evaluation suggests that this model is somewhat undertrained. For best performance from a base-sized model, we recommend deepset/gbert-base.

Overview

Paper: here
Architecture: ELECTRA base (discriminator)
Language: German
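
As a minimal usage sketch, the discriminator checkpoint can be loaded with the Hugging Face transformers library; the German example sentence is only an illustration, everything else is standard transformers usage.

# Minimal sketch: load the gelectra-base discriminator as an encoder
# and extract contextual token embeddings.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("deepset/gelectra-base")
model = AutoModel.from_pretrained("deepset/gelectra-base")

# Encode a German sentence (illustrative input) and run the encoder.
inputs = tokenizer("Willkommen in Berlin.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, 768) for the base model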

Performance

GermEval18 Coarse: 76.02
GermEval18 Fine:   42.22
GermEval14:        86.02
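
For classification tasks such as GermEval18 coarse, a task head can be attached on top of this checkpoint. The sketch below assumes a binary label set and is illustrative only; it is not the exact evaluation setup behind the numbers above.

# Illustrative only: attach a randomly initialized sequence-classification
# head to the pretrained encoder (num_labels=2 is an assumption for a
# binary task like GermEval18 coarse).
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("deepset/gelectra-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "deepset/gelectra-base", num_labels=2
)
# From here, fine-tune with the transformers Trainer or a plain PyTorch loop.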

See also:
deepset/gbert-base
deepset/gbert-large
deepset/gelectra-base
deepset/gelectra-large
deepset/gelectra-base-generator
deepset/gelectra-large-generator

Authors

Branden Chan: branden.chan [at] deepset.ai
Stefan Schweter: stefan [at] schweter.eu
Timo Möller: timo.moeller [at] deepset.ai

About us


We bring NLP to the industry via open source!
Our focus: industry-specific language models & large-scale QA systems.


Get in touch: Twitter | LinkedIn | Website