### To replicate the whole dataset generation process ###
1. Set up a new Python/Conda environment (recommended Python version: 3.9.6 to 3.9.18 or 3.10.0 to 3.10.13) and install the requirements listed in ```requirements.txt``` to use this codebase, via ```pip install -r requirements.txt``` (a combined sketch of steps 1–3 follows this list).
2. Activate the Python/Conda environment in which the requirements were installed.
3. Force install ```multiprocess==0.70.15``` via ```pip install multiprocess==0.70.15``` to avoid [this issue](https://github.com/huggingface/datasets/issues/5613#issuecomment-1703169594) (there is no other workaround for now, especially for Python 3.10.x).
4. Run this ```sh``` script to extract the raw data from the Wikipedia HF dataset (both script invocations are also sketched after this list):
```sh extract_raw_wiki_data_sea.sh```
This script will run [_```extract_raw_wiki_data.py```_](https://huggingface.co/datasets/sabilmakbar/sea_wiki/blob/main/extract_raw_wiki_data.py) to construct the Wiki Dataset.
5. Run this ```sh``` script to deduplicate the data extracted in step 4:
```sh dedup_raw_wiki_data_sea.sh```
This script will run [_```dedup_raw_wiki_data.py```_](https://huggingface.co/datasets/sabilmakbar/sea_wiki/blob/main/dedup_raw_wiki_data.py) to perform Wiki Dataset cleansing. Please note that the cleansing process can be language/dialect specific.
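
Putting steps 1–3 together, environment setup might look like this (a minimal sketch assuming Conda; the environment name ```sea_wiki``` is illustrative, not part of this repo):

```sh
# Sketch only: "sea_wiki" is an illustrative environment name.
# Any Python 3.9.6-3.9.18 or 3.10.0-3.10.13 should work.
conda create -n sea_wiki python=3.10.13 -y
conda activate sea_wiki

# Install this codebase's requirements.
pip install -r requirements.txt

# Step 3: pin multiprocess to avoid the linked datasets issue.
pip install multiprocess==0.70.15
```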
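
Steps 4–5 then reduce to running the two scripts in order; the dedup step consumes the raw data produced by the extraction step:

```sh
# Run from the repository root; step 5 depends on step 4's output.
sh extract_raw_wiki_data_sea.sh
sh dedup_raw_wiki_data_sea.sh
```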