Does this dataset contain all the data from Wikipedia?

#57
by TiamoLee - opened

For example, the en data is only 11 GB in Parquet format. Is this only the most recently updated data, or the full dataset?

Yes, the size is smaller because of preprocessing.

I have a similar question, since the README says there is a "train" subset for each language. Is it just a sample of Wikipedia pages, or should all of them be there? I need all the pages on medical subjects; I am planning to filter the rest out for an application.


The dataset contains all the Wikipedia pages as of the date of the dump.

Each date-language subset contains a single "train" split.
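
For reference, a minimal sketch of loading one date-language subset with the datasets library; the "20231101.en" config name is only an example, use whichever dump date is listed on the dataset page:

```python
from datasets import load_dataset

# Load the English subset of one dump; config names follow the
# "<dump date>.<language code>" pattern, e.g. "20231101.en".
ds = load_dataset("wikimedia/wikipedia", "20231101.en", split="train")

print(ds)              # one row per article in that dump
print(ds[0]["title"])  # each row carries the full article text
```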

albertvillanova changed discussion status to closed

It is worth noting, though, that the default preprocessing script removes, among other things, all the pages that simply redirect to another page. @TiamoLee
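
If it helps with the medical filtering mentioned above, here is a rough sketch; the keyword list is purely illustrative (not a real medical-topic classifier), and the "20231101.en" config is just an example:

```python
from datasets import load_dataset

# Purely illustrative keyword list; a real application would want a
# proper topic classifier or a category-based filter instead.
MEDICAL_KEYWORDS = ("disease", "medicine", "medical", "diagnosis", "therapy")

def looks_medical(example):
    # Each row has "title" and "text" columns with the article content.
    text = (example["title"] + " " + example["text"]).lower()
    return any(keyword in text for keyword in MEDICAL_KEYWORDS)

ds = load_dataset("wikimedia/wikipedia", "20231101.en", split="train")
medical_pages = ds.filter(looks_medical)
print(medical_pages.num_rows)
```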
