Update schema: the update folder may contain temporary dumps for each range (starting from the end of the last update).
Each file should not exceed 5GB, per Hugging Face guidelines.
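
A minimal sketch of keeping each dump under the 5GB target; the `write_update_chunks` helper, the naming scheme, and the `(name, payload)` record shape are assumptions for illustration, not part of the actual pipeline:

```python
import io
import tarfile
from pathlib import Path

MAX_BYTES = 5 * 1024**3  # keep each archive under the ~5GB per-file guideline

def write_update_chunks(records, out_dir: Path, prefix: str = "update") -> None:
    """Pack (name, payload_bytes) records into numbered tars, rolling over before 5GB."""
    out_dir.mkdir(parents=True, exist_ok=True)
    part, written, tar = 0, 0, None
    for name, payload in records:
        # Start a new tar when the next payload would push us past the limit.
        if tar is None or written + len(payload) > MAX_BYTES:
            if tar is not None:
                tar.close()
            part += 1
            written = 0
            tar = tarfile.open(out_dir / f"{prefix}-{part:04d}.tar", "w")
        info = tarfile.TarInfo(name=name)
        info.size = len(payload)
        tar.addfile(info, io.BytesIO(payload))
        written += len(payload)
    if tar is not None:
        tar.close()
```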

At the end of the year, the files may be merged into the main dataset, or kept as yearly data.
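
A possible shape for that year-end merge, assuming the updates sit as plain tar archives inside an `update/` folder; the folder and output names are placeholders only:

```python
import tarfile
from pathlib import Path

def merge_updates(update_dir: Path, yearly_tar: Path) -> None:
    """Concatenate every update tar into one yearly archive, in sorted (range) order."""
    with tarfile.open(yearly_tar, "w") as out:
        for part in sorted(update_dir.glob("*.tar")):
            with tarfile.open(part, "r") as src:
                for member in src.getmembers():
                    if member.isfile():
                        out.addfile(member, src.extractfile(member))
```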

Each update file may contain duplicates, since each crawl is independent.
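
Because the crawls overlap, a consumer (or the merge sketch above) may want to drop repeated entries. One simple approach, assuming the tar member name identifies the item, is to keep only the first occurrence of each name:

```python
import tarfile
from pathlib import Path
from typing import Iterator, Tuple

def iter_unique_members(update_dir: Path) -> Iterator[Tuple[Path, tarfile.TarInfo]]:
    """Yield (tar_path, member) pairs, skipping names already seen in earlier update tars."""
    seen: set = set()
    for part in sorted(update_dir.glob("*.tar")):
        with tarfile.open(part, "r") as src:
            for member in src.getmembers():
                if member.isfile() and member.name not in seen:
                    seen.add(member.name)
                    yield part, member
```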

This patch is temporary, pending a super-squash, so that users do not have to download multiple tars from different revision hashes.
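
For context, recent versions of `huggingface_hub` expose `HfApi.super_squash_history`, which collapses a repo's commit history into a single commit so clients resolve only one revision hash. A hedged sketch of how that eventual squash could be invoked (the repo id is a placeholder):

```python
from huggingface_hub import HfApi

api = HfApi()  # assumes a write token is already configured (e.g. via `huggingface-cli login`)

# Collapse the dataset repo's commit history into one commit, so downloads
# no longer pull tars from several different revision hashes.
api.super_squash_history(
    repo_id="username/dataset-name",  # placeholder repo id
    repo_type="dataset",
    branch="main",
)
```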

nyanko7 changed pull request status to merged
