German ELECTRA large generator
Released in October 2020, this is the generator component of the German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our paper, we outline the steps taken to train our model.
The generator is useful for performing masking experiments. If you are looking for a regular language model for embedding extraction, or downstream tasks like NER, classification or QA, please use deepset/gelectra-large.
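As a quick illustration of such a masking experiment, the generator can be loaded as a masked language model with the Hugging Face `transformers` library and asked to fill in a `[MASK]` token. A minimal sketch (the example sentence is just an illustration):

```python
# Minimal masking-experiment sketch, assuming the `transformers`
# library is installed and the model can be downloaded from the Hub.
from transformers import pipeline

# Load the generator (a small masked LM) into a fill-mask pipeline.
fill_mask = pipeline("fill-mask", model="deepset/gelectra-large-generator")

# Ask the generator for candidate fillings of the masked token.
predictions = fill_mask("Die Hauptstadt von Deutschland ist [MASK].")

for p in predictions:
    # Each prediction carries the proposed token and a probability score.
    print(f"{p['token_str']!r} (score={p['score']:.3f})")
```

Each returned candidate includes the filled-in token string, its score, and the full decoded sequence, which makes it easy to probe what the generator has learned about German.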
Overview
Paper: here
Architecture: ELECTRA large (generator)
Language: German
Performance
GermEval18 Coarse: 80.70
GermEval18 Fine: 55.16
GermEval14: 88.95
See also:
deepset/gbert-base
deepset/gbert-large
deepset/gelectra-base
deepset/gelectra-large
deepset/gelectra-base-generator
deepset/gelectra-large-generator
Authors
Branden Chan: branden.chan [at] deepset.ai
Stefan Schweter: stefan [at] schweter.eu
Timo Möller: timo.moeller [at] deepset.ai
About us
deepset is the company behind the production-ready open-source AI framework Haystack.
Some of our other work:
- Distilled roberta-base-squad2 (aka "tinyroberta-squad2")
- German BERT, GermanQuAD and GermanDPR, German embedding model
- deepset Cloud, deepset Studio
Get in touch and join the Haystack community
For more info on Haystack, visit our GitHub repo and Documentation.
We also have a Discord community open to everyone!
Twitter | LinkedIn | Discord | GitHub Discussions | Website | YouTube
By the way: we're hiring!