Dataset Card for yaakov/wikipedia-de-splits

Dataset Description

The only goal of this dataset is to provide random German Wikipedia articles at various dataset sizes: small splits for fast development and large splits for statistically relevant measurements.

For this purpose, I loaded the 2665357 articles in the train split of the pre-processed German Wikipedia dump from 2022-03-01, randomly permuted them, and created splits of sizes 2**n: 1, 2, 4, 8, .... The split names are strings ('0', '1', '2', ...). The split 'all' contains all 2665357 available articles.
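The split-size scheme above can be sketched in plain Python, with no Hub access needed (the article count 2665357 is taken from the description; the function name `split_sizes` is illustrative, not part of the dataset's API):

```python
# Reproduce the split names and sizes: powers of two up to the corpus
# size, plus a final 'all' split (2665357 articles, per the description).
NUM_ARTICLES = 2_665_357

def split_sizes(total: int) -> dict:
    """Map split name -> number of articles, mirroring the creation loop."""
    sizes = {}
    k, n = 0, 1
    while n <= total:
        sizes[str(k)] = n
        k += 1
        n *= 2
    sizes['all'] = total
    return sizes

sizes = split_sizes(NUM_ARTICLES)
```

For this corpus the power-of-two splits run from '0' (1 article) to '21' (2**21 = 2097152 articles); 2**22 would already exceed the corpus, so 'all' is the only larger split.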

Dataset creation

This dataset has been created with the following script:

!apt install git-lfs
!pip install -q transformers datasets

# Log in so push_to_hub can write to the Hub account.
from huggingface_hub import notebook_login
notebook_login()

# Load the pre-processed German Wikipedia dump from 2022-03-01.
from datasets import load_dataset
wikipedia_de = load_dataset("wikipedia", "20220301.de")['train']

# Fixed seed for a reproducible random permutation of the articles.
shuffled = wikipedia_de.shuffle(seed=42)

from datasets import DatasetDict
res = DatasetDict()

# Split 'k' contains the first 2**k shuffled articles.
k, n = 0, 1
while n <= shuffled.num_rows:
    res[str(k)] = shuffled.select(range(n))
    k += 1; n *= 2
res['all'] = shuffled  # the full corpus as one extra split

res.push_to_hub('yaakov/wikipedia-de-splits')