---
license: cc-by-sa-3.0
---

# German DBMDZ BERT Corpus

This dataset includes all corpora that were used for pretraining the German DBMDZ BERT model.

It consists of a German Wikipedia dump and corpora from OPUS:

| Filename            | Description        | Creation Date | File Size |
| ------------------- | ------------------ | ------------- | --------- |
| `dewiki.txt`        | Wikipedia dump     | May 2019      | 5.1 GB    |
| `eubookshop.txt`    | OPUS EUbookshop    | November 2018 | 2.2 GB    |
| `news.2018.txt`     | OPUS News corpora  | January 2019  | 4.1 GB    |
| `opensubtitles.txt` | OPUS OpenSubtitles | November 2018 | 1.3 GB    |
| `paracrawl.txt`     | OPUS ParaCrawl     | November 2018 | 3.1 GB    |
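
The table above can be tallied to get the combined size of the pretraining corpus. A minimal sketch (the file list and sizes are copied from the table; the variable names are illustrative only):

```python
# Corpus files and their sizes in GB, as listed in the table above.
corpus_files = {
    "dewiki.txt": 5.1,
    "eubookshop.txt": 2.2,
    "news.2018.txt": 4.1,
    "opensubtitles.txt": 1.3,
    "paracrawl.txt": 3.1,
}

# Total uncompressed text used for pretraining.
total_gb = round(sum(corpus_files.values()), 1)
print(f"Total corpus size: {total_gb} GB")  # prints "Total corpus size: 15.8 GB"
```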