sabilmakbar committed
Commit ef57624
1 Parent(s): a056553

Fix readme

Files changed (1): README.md (+5 −3)
README.md CHANGED
@@ -462,10 +462,12 @@ You may visit this [Wikipedia Dump Index](https://dumps.wikimedia.org/backup-ind
  1. Set up a new Python/Conda environment (recommended Python version: 3.9.6 to 3.9.18 or 3.10.0 to 3.10.13) and install the requirements listed in ```requirements.txt``` for this codebase via ```pip install -r requirements.txt```.
  2. Activate the Python/Conda environment in which the requirements were installed.
  3. Force-install ```multiprocess==0.70.15``` via ```pip install multiprocess==0.70.15``` to avoid [this issue](https://github.com/huggingface/datasets/issues/5613#issuecomment-1703169594) (there is no other workaround for now, especially for Python 3.10.x).
- 4. Run this ```sh``` script for extraction from the Wikimedia Dump:
- ```sh extract_raw_wiki_data_sea.sh```. This script will run [_```extract_raw_wiki_data.py```_](https://huggingface.co/datasets/sabilmakbar/sea_wiki/blob/main/extract_raw_wiki_data.py) to construct the Wiki Dataset.
+ 4. Run this ```sh``` script for extraction from Wikipedia HF:
+ ```sh extract_raw_wiki_data_sea.sh```.
+ This script will run [_```extract_raw_wiki_data.py```_](https://huggingface.co/datasets/sabilmakbar/sea_wiki/blob/main/extract_raw_wiki_data.py) to construct the Wiki Dataset.
  5. Run this ```sh``` script for deduplication:
- ```sh dedup_raw_wiki_data_sea.sh```. This script will run [_```dedup_raw_wiki_data.py```_](https://huggingface.co/datasets/sabilmakbar/sea_wiki/blob/main/dedup_raw_wiki_data.py) to perform Wiki Dataset cleansing. Please note that the cleansing process can be language/dialect specific.
+ ```sh dedup_raw_wiki_data_sea.sh```.
+ This script will run [_```dedup_raw_wiki_data.py```_](https://huggingface.co/datasets/sabilmakbar/sea_wiki/blob/main/dedup_raw_wiki_data.py) to perform Wiki Dataset cleansing. Please note that the cleansing process can be language/dialect specific.

  ## Citation Info:
  ```
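
As a rough illustration of what the deduplication step above does conceptually, the sketch below removes exact-duplicate article texts after naive normalization. This is a hypothetical example, not the actual logic of ```dedup_raw_wiki_data.py```, whose cleansing can be language/dialect specific.

```python
# Hypothetical sketch of exact-match deduplication over Wiki records.
# NOTE: the real dedup_raw_wiki_data.py may normalize and filter
# differently per language/dialect.

def dedup_records(records):
    """Keep the first occurrence of each normalized article text."""
    seen = set()
    unique = []
    for rec in records:
        key = rec["text"].strip().lower()  # naive normalization
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

if __name__ == "__main__":
    sample = [
        {"title": "A", "text": "Jakarta is the capital of Indonesia."},
        {"title": "A (copy)", "text": "Jakarta is the capital of Indonesia. "},
        {"title": "B", "text": "Hanoi is the capital of Vietnam."},
    ]
    print(len(dedup_records(sample)))  # -> 2
```

Near-duplicate detection (e.g. hashing or similarity thresholds) would be a natural extension, but the exact-match version keeps the idea clear.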