sabilmakbar committed
Commit: fab7320
Parent(s): 16c0c5b
Fix readme

README.md CHANGED
@@ -467,10 +467,12 @@ You may visit this [Wikipedia Dump Index](https://dumps.wikimedia.org/backup-ind
4. Run this ```sh``` script for extraction from Wikipedia HF:
    ```sh extract_raw_wiki_data_sea.sh```.
+
    This script will run [_```extract_raw_wiki_data.py```_](https://huggingface.co/datasets/sabilmakbar/sea_wiki/blob/main/extract_raw_wiki_data.py) to construct the Wiki Dataset (see the first sketch after this list).

5. Run this ```sh``` script for deduplication of the data extracted in Step 4:
    ```sh dedup_raw_wiki_data_sea.sh```.
+
    This script will run [_```dedup_raw_wiki_data.py```_](https://huggingface.co/datasets/sabilmakbar/sea_wiki/blob/main/dedup_raw_wiki_data.py) to perform the Wiki Dataset cleansing (see the second sketch after this list). Please note that the cleansing process can be language/dialect specific.
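For orientation, here is a minimal sketch of what the extraction step amounts to, assuming the `wikimedia/wikipedia` snapshots hosted on the Hugging Face Hub. The snapshot date, language code, and output file name are illustrative assumptions, not taken from `extract_raw_wiki_data.py`:

```python
# Hedged sketch of Step 4: pull one SEA-language Wikipedia snapshot from the HF Hub
# and dump it to CSV. This is NOT the repository's extract_raw_wiki_data.py; the
# dataset name, config, and output file name are assumptions for illustration only.
from datasets import load_dataset

lang = "ace"           # Acehnese, one of the SEA Wikipedia languages
snapshot = "20231101"  # hypothetical dump date

ds = load_dataset("wikimedia/wikipedia", f"{snapshot}.{lang}", split="train")
ds.to_csv(f"wiki_{lang}_{snapshot}_raw_dataset.csv")
```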
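Similarly, a minimal sketch of the deduplication step; the column names (`title`, `text`) and file names are hypothetical, and the actual `dedup_raw_wiki_data.py` applies the language/dialect-specific cleansing mentioned above:

```python
# Hedged sketch of Step 5: drop empty and exactly duplicated articles from the
# extracted CSV. This is NOT the repository's dedup_raw_wiki_data.py; column and
# file names are assumptions for illustration only.
import pandas as pd

df = pd.read_csv("wiki_ace_20231101_raw_dataset.csv")

# drop rows whose text is missing or only whitespace
df = df[df["text"].notna() & df["text"].str.strip().ne("")]

# drop exact duplicates on the article text, keeping the first occurrence
df = df.drop_duplicates(subset=["text"], keep="first")

df.to_csv("wiki_ace_20231101_dedup_dataset.csv", index=False)
```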
## Citation Info: