German ELECTRA large generator

Released in October 2020, this is the generator component of the German ELECTRA language model, trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our paper, we outline the steps taken to train our model.

The generator is useful for performing masking experiments. If you are looking for a regular language model for embedding extraction, or downstream tasks like NER, classification or QA, please use deepset/gelectra-large.
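As a quick sketch of such a masking experiment, the model can be queried through the transformers fill-mask pipeline (the German example sentence below is illustrative, not from the paper):

```python
from transformers import pipeline

# Load the generator as a fill-mask pipeline; the model's mask token is [MASK].
fill_mask = pipeline("fill-mask", model="deepset/gelectra-large-generator")

# Illustrative sentence: "The capital of Germany is [MASK]."
results = fill_mask("Die Hauptstadt von Deutschland ist [MASK].")
for r in results:
    print(r["token_str"], round(r["score"], 4))
```

Each result dict carries the predicted token (`token_str`), its probability (`score`), and the completed sentence (`sequence`).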


Paper: here
Architecture: ELECTRA large (generator)
Language: German
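For finer control than the pipeline offers, the generator can also be loaded directly as a masked-LM head. A minimal sketch (the sentence and the top-5 choice are illustrative assumptions):

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_name = "deepset/gelectra-large-generator"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Illustrative sentence: "The dog [MASK] in the garden."
text = "Der Hund [MASK] im Garten."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the five highest-scoring tokens.
mask_idx = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top5 = logits[0, mask_idx].topk(5).indices[0]
candidates = tokenizer.convert_ids_to_tokens(top5.tolist())
print(candidates)
```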


Performance

GermEval18 Coarse: 80.70
GermEval18 Fine:   55.16
GermEval14:        88.95

See also:
deepset/gbert-base
deepset/gbert-large
deepset/gelectra-base
deepset/gelectra-large
deepset/gelectra-base-generator
deepset/gelectra-large-generator


Authors

Branden Chan: branden.chan [at]
Stefan Schweter: stefan [at]
Timo Möller: timo.moeller [at]

About us


We bring NLP to the industry via open source!
Our focus: industry-specific language models & large-scale QA systems.

Some of our work:

Get in touch: Twitter | LinkedIn | Website