Missing articles in en dataset

by davidmezzetti - opened


I was excited to see an updated version of the Wikipedia dataset. Thank you for all the work in creating this dataset.

I was planning to use this as the new source for txtai-wikipedia. While I was able to rebuild with 20231101.en as the new input source, I noticed some inconsistencies when testing. It appears that some fairly popular articles are missing.
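For anyone who wants to reproduce the check, here is a minimal sketch of how one could scan a split for expected titles. In practice the rows would come from `datasets.load_dataset("wikimedia/wikipedia", "20231101.en", split="train", streaming=True)`; the stub rows and titles below are placeholders, not the actual missing articles.

```python
# Sketch: find which expected titles never appear in a stream of dataset rows.
# In real use, `rows` would be the streamed "wikimedia/wikipedia" split;
# a tiny stub stands in for it here so the example is self-contained.

def find_missing(rows, expected_titles):
    """Return the expected titles that are absent from the row stream."""
    remaining = set(expected_titles)
    for row in rows:
        remaining.discard(row["title"])
        if not remaining:
            break  # everything found, no need to keep streaming
    return sorted(remaining)

# Stub rows standing in for the real dataset (titles are illustrative only).
stub_rows = [{"title": "Apple"}, {"title": "Banana"}]
print(find_missing(stub_rows, ["Apple", "Cherry"]))  # ['Cherry']
```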

Here are a few examples that are missing.

It's possible this is a root issue with the upstream dataset. I used a modified version of the olm/wikipedia project, and those articles are present in the dataset I built manually. I used the 20240101 dumps for that build though, so again it's entirely possible it's an upstream issue.

I did notice one difference between the olm/wikipedia project and the scripts in the wikipedia project: olm/wikipedia removed this code.

-                    # Filter redirects.
-                    if raw_content is None or red_ is not None:
-                        beam.metrics.Metrics.counter(language, "filtered-redirects").inc()
-                        continue

Not sure if it's relevant but wanted to share.
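To illustrate what that removed block does, here is a small self-contained sketch of the redirect check against minimal page fragments mimicking the MediaWiki XML export format (the real script resolves a namespace prefix on each tag; it is stripped here for brevity, and the page contents are invented for the example):

```python
import xml.etree.ElementTree as ET

# Minimal fragments in the shape of MediaWiki <page> export elements.
REDIRECT_PAGE = """
<page>
  <title>USA</title>
  <redirect title="United States"/>
  <revision><text>#REDIRECT [[United States]]</text></revision>
</page>
"""

ARTICLE_PAGE = """
<page>
  <title>United States</title>
  <revision><text>The United States is a country...</text></revision>
</page>
"""

def is_filtered(page_xml):
    """Apply the same condition as the removed block: drop the page
    if it has no text or carries a <redirect> tag."""
    elem = ET.fromstring(page_xml)
    raw_content = elem.find("./revision/text").text
    red_ = elem.find("./redirect")
    return raw_content is None or red_ is not None

print(is_filtered(REDIRECT_PAGE))  # True  -> page would be dropped
print(is_filtered(ARTICLE_PAGE))   # False -> page would be kept
```

If olm/wikipedia really does skip this check, its output would include redirect stubs as "articles", which could explain why titles present there are absent here rather than the other way around.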

Bumping this, as I am encountering the same issue. Here are a bunch more links that are not in the dataset but are in the level 5 vital articles (by my count, there are 10k+ level 5 vital articles missing from the dump):


I can give the whole list for debugging if needed.
