srehaag committed
Commit 403d381 (parent: e0cfbf1)

added some internal notes for next time

Files changed (1)
  1. rll-internal-instructions.md +44 -0
rll-internal-instructions.md ADDED
To use git lfs and huggingface hub:

Do once:

(1) install git lfs on the machine
(2) pip install --upgrade huggingface_hub
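A consolidated sketch of the one-time setup. How git lfs gets installed depends on the platform; the package command below assumes a Debian/Ubuntu machine (on Windows, use the git-lfs installer instead).

```bash
# One-time machine setup; the git-lfs install is platform-specific
# (this line assumes Debian/Ubuntu)
sudo apt-get install git-lfs

# huggingface_hub provides the huggingface-cli command used below
pip install --upgrade huggingface_hub
```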
Then:

(1) huggingface-cli login
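Running the login command interactively prompts for an access token from the Hugging Face account settings. A non-interactive variant is sketched below; `HF_TOKEN` is a placeholder environment variable (not something set up by these notes), and it assumes a reasonably recent huggingface_hub.

```bash
# Non-interactive login; HF_TOKEN is a placeholder holding a write-scoped
# access token from the Hugging Face settings page.
# --add-to-git-credential also stores it so the later `git push` over https
# can authenticate.
huggingface-cli login --token "$HF_TOKEN" --add-to-git-credential
```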
Then:

(1) git lfs install
(2) git clone "https://huggingface.co/datasets/refugee-law-lab/canadian-legal-data"
(3) cd into repo
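Put together, the per-checkout steps look roughly like this; the directory name in the cd step is simply the default folder that git clone derives from the repo URL.

```bash
# Per-checkout setup: enable the LFS hooks, clone the dataset repo, enter it
git lfs install
git clone "https://huggingface.co/datasets/refugee-law-lab/canadian-legal-data"
cd canadian-legal-data   # default directory name created by git clone
```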
For tracking specific new big files with lfs (i.e. new / moved files):

(1) git lfs track "d:/xxx/xxx/bigfile.parquet"
(2) if over 5 GB: huggingface-cli lfs-enable-largefiles .
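A sketch of this step, assuming the new files sit inside the cloned repo (the note above uses an absolute d:/ path, but git lfs track records patterns relative to the repo root). Tracking writes matching rules into .gitattributes, which has to be committed along with the data.

```bash
# Track big files by pattern (or by a single path relative to the repo root);
# this appends matching rules to .gitattributes, which must be committed too
git lfs track "*.parquet"
git add .gitattributes

# Only needed when an individual file is over the 5 GB limit
huggingface-cli lfs-enable-largefiles .
```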
When ready to push:

(1) git add .
(2) git commit -m "commit message here"
(3) git push
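After committing and before pushing, it can be worth a quick sanity check that the big files are actually going through LFS rather than sitting in git as plain blobs; one way to check:

```bash
# Files listed here are stored via LFS pointers; a large parquet file missing
# from this list was committed as a plain git object and should be re-tracked
git lfs ls-files
```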
Notes:

- If making a new dataset, it is easiest to create the repo on huggingface first (make sure to create a dataset, not a model)
- If you add files in parquet, json, etc., huggingface will configure the dataset automatically (make sure to include "train" in the file names)
- Note that the url includes /datasets/ (forgot that and it was confusing)
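To illustrate the "train" naming convention from the notes above, a hypothetical layout (the file names are made up for this example; the point is that each data file name contains "train", so the Hub picks them up as the train split):

```bash
# Hypothetical file names (examples only): because each data file name
# contains "train", the Hub auto-configures them into a single train split
ls
#   README.md
#   train-federal-court.parquet
#   train-ontario.parquet
```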