Tristan committed on
Commit 817c3cd
1 Parent(s): 27d1c6a

Update README.md

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -670,12 +670,12 @@ The datasets are built from the Wikipedia dump
 contains the content of one full Wikipedia article with cleaning to strip
 markdown and unwanted sections (references, etc.).
 
-The articles are parsed using the ``mwparserfromhell`` tool.
+The articles are parsed using the ``mwparserfromhell`` tool, and we use ``multiprocess`` for parallelization.
 
-To load this dataset you need to install ``mwparserfromhell`` first:
+To load this dataset you need to install these first:
 
 ```
-pip install mwparserfromhell
+pip install mwparserfromhell==0.6.4 multiprocess==0.70.13
 ```
 
 Then, you can load any subset of Wikipedia per language and per date this way:
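For context on the changed lines: ``mwparserfromhell`` parses raw wikitext into a tree whose markup can be stripped, and ``multiprocess`` provides a ``Pool`` for running that cleaning step over many articles at once. The snippet below is only a rough sketch of that combination, not the dataset's actual processing script; the sample data and function name are made up.

```
import mwparserfromhell
from multiprocess import Pool

def clean_article(wikitext):
    # Parse one article's wikitext and strip the markup, keeping plain text.
    return mwparserfromhell.parse(wikitext).strip_code().strip()

if __name__ == "__main__":
    # Stand-in articles; the real pipeline reads them from the Wikipedia dump.
    raw_articles = ["'''Hello''' [[world]]", "Some ''other'' article."]
    with Pool() as pool:
        cleaned = pool.map(clean_article, raw_articles)
    print(cleaned)
```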
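The last context line of the hunk points at the README's own loading example, which lies outside this diff. Assuming the dataset follows the usual Hugging Face ``load_dataset`` pattern with ``language`` and ``date`` keyword arguments, a call might look like this; the repository ID is a placeholder, and the ``datasets`` library must be installed alongside the two pinned dependencies above.

```
from datasets import load_dataset

# Placeholder repo ID: substitute the actual Hub ID of this dataset.
# "language" is a Wikipedia language code and "date" a dump date (YYYYMMDD);
# which dates are available depends on the published dumps.
ds = load_dataset("<namespace>/wikipedia", language="en", date="20220920")
```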