canadian-legal-data / rll-internal-instructions.md

To use Git LFS and the Hugging Face Hub:

Do once:

(1) install Git LFS on the machine (2) pip install --upgrade huggingface_hub
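The one-time setup above can be sketched as follows (the apt-get line assumes a Debian/Ubuntu machine; use your own package manager otherwise):

```shell
# One-time setup: install Git LFS (package manager varies by OS)
sudo apt-get install git-lfs       # or: brew install git-lfs on macOS

# Install/upgrade the Hugging Face Hub client library and CLI
pip install --upgrade huggingface_hub
```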

Then:

(1) huggingface-cli login
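The login step prompts for an access token, which is created on the Hub under Settings → Access Tokens (use a token with write scope if you plan to push):

```shell
# Prompts interactively for a Hugging Face access token and stores it
huggingface-cli login
```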

Then:

(1) git lfs install (2) git clone "https://huggingface.co/datasets/refugee-law-lab/canadian-legal-data" (3) cd into repo
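As a fenced block, the clone sequence above looks like this (the directory name in the cd step matches the repo name):

```shell
# Enable Git LFS for this user account (sets up LFS filters in git config)
git lfs install

# Clone the dataset repo; note the /datasets/ segment in the URL
git clone "https://huggingface.co/datasets/refugee-law-lab/canadian-legal-data"
cd canadian-legal-data
```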

To track specific new big files using LFS (i.e. new / moved files):

(1) git lfs track "d:/xxx/xxx/bigfile.parquet" (2) if over 5 GB: huggingface-cli lfs-enable-largefiles .
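A sketch of the tracking step, keeping the placeholder path from the notes; the extra `git add .gitattributes` line is worth remembering, since that is the file where LFS records what it tracks:

```shell
# Tell LFS to handle this file (placeholder path from the notes)
git lfs track "d:/xxx/xxx/bigfile.parquet"

# The tracking rules are written to .gitattributes, so commit that file too
git add .gitattributes

# Only needed when an individual file exceeds 5 GB
huggingface-cli lfs-enable-largefiles .
```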

When ready to push:

(1) git add . (2) git commit -m "commit message here" (3) git push
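The push sequence above as a block:

```shell
# Stage everything, commit, and push; LFS uploads tracked files automatically
git add .
git commit -m "commit message here"
git push
```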

Notes:

  • If making a new dataset, it's easiest to create it first on Hugging Face (make sure to create a dataset, not a model)

  • If you add files in Parquet, JSON, etc., Hugging Face will configure everything automatically (make sure to include "train" in the file names)

  • Note that the URL includes /datasets/ (easy to forget, and confusing when it's missing)
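On the "train" naming note: Hugging Face infers dataset splits from file names, so a minimal layout might look like this (file names here are hypothetical examples, not from the repo):

```shell
# Hypothetical layout: the Hub detects splits from the file names
mkdir -p data
touch data/train.parquet   # picked up as the "train" split
touch data/test.parquet    # picked up as the "test" split
ls data
```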