peterdavidfagan committed · Commit 8dbc877 · verified · 1 Parent(s): f7e9461

Update README.md

Files changed (1): README.md +44 -1
README.md CHANGED

<video controls autoplay src="https://cdn-uploads.huggingface.co/production/uploads/6018554e68258223ca22136f/6og2VldKOfp0Ci31h-r_w.mp4"></video>

This dataset is used to train a transporter network for real-world pick-and-place tasks within the RAD lab at the University of Edinburgh. The dataset is in TFDS format and was collected using [moveit2_data_collector](https://github.com/peterdavidfagan/moveit2_data_collector). The dataset is currently being tested as we prove out the overall pipeline; keep monitoring this dataset and the related repositories for documentation updates.

# Download
An example of downloading and loading the dataset is given below; as larger datasets are uploaded, this example script will change:

```python
import os
import tarfile

import tensorflow_datasets as tfds
from huggingface_hub import hf_hub_download

DATA_DIR = "/home/robot"
FILENAME = "data.tar.xz"
EXTRACTED_FILENAME = "data"
FILEPATH = os.path.join(DATA_DIR, FILENAME)
EXTRACTED_FILEPATH = os.path.join(DATA_DIR, EXTRACTED_FILENAME)

# download the compressed dataset from the Hugging Face Hub
hf_hub_download(
    repo_id="peterdavidfagan/transporter_networks",
    repo_type="dataset",
    filename=FILENAME,
    local_dir=DATA_DIR,
)

# uncompress the archive
with tarfile.open(FILEPATH, "r:xz") as tar:
    tar.extractall(path=DATA_DIR)
# os.remove(FILEPATH)  # optionally delete the archive after extraction

# load the extracted dataset with tfds
ds = tfds.builder_from_directory(EXTRACTED_FILEPATH).as_dataset()["train"]

# basic inspection of the data
print(ds.element_spec)
for eps in ds:
    print(eps["extrinsics"])
    for step in eps["steps"]:
        print(step["is_first"])
        print(step["is_last"])
        print(step["is_terminal"])
        print(step["action"])
```