
FeelSight: A visuo-tactile robot manipulation dataset


FeelSight is a dataset of vision, touch, and proprioception data collected during in-hand rotation of objects via an RL policy. It comprises 70 experiments in total, 30 in the real world and 40 in simulation, each lasting 30 seconds. For training neural field models with FeelSight, refer to the NeuralFeels repository.

Simulation data

Our simulated data is collected in IsaacGym with TACTO touch simulation in the loop.

Real-world data

Here's an example of real-world data from our three-camera setup and the DIGIT-Allegro hand:

Robot setup

The Allegro hand is mounted on a Franka Emika Panda robot. The hand is sensorized with DIGIT tactile sensors and surrounded by three Intel RealSense cameras.

Dataset structure

For dataloaders, refer to the NeuralFeels repository.

feelsight/ # root directory, either feelsight or feelsight_real
β”œβ”€β”€ object_1/ # e.g. 077_rubiks_cube
β”‚   β”œβ”€β”€ 00/ # log directory
β”‚   β”‚   β”œβ”€β”€ allegro/ # tactile sensor data
β”‚   β”‚   β”‚   β”œβ”€β”€ index/ # finger id
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ depth/ # only in sim, ground-truth
β”‚   β”‚   β”‚   β”‚   β”‚   └── ..jpg
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ image/ # RGB tactile images
β”‚   β”‚   β”‚   β”‚   β”‚   └── ..jpg
β”‚   β”‚   β”‚   β”‚   └── mask/ # only in sim, ground-truth
β”‚   β”‚   β”‚   β”‚       └── ..jpg
β”‚   β”‚   β”‚   └── ..
β”‚   β”‚   β”œβ”€β”€ realsense/ # RGB-D data
β”‚   β”‚   β”‚   └── front-left/ # camera id
β”‚   β”‚   β”‚       β”œβ”€β”€ image/ # RGB images
β”‚   β”‚   β”‚       β”‚   └── ..jpg
β”‚   β”‚   β”‚       β”œβ”€β”€ seg/ # only in sim, ground-truth
β”‚   β”‚   β”‚       β”‚   └── ..jpg
β”‚   β”‚   β”‚       └── depth.npz # depth images
β”‚   β”‚   β”œβ”€β”€ object_1.mp4 # video of sensor stream
β”‚   β”‚   └── data.pkl # proprioception data
β”‚   └── ..
β”œβ”€β”€ object_2/
β”‚   └── ..
└── ..
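As a quick illustration of how this layout can be traversed, here is a minimal Python sketch. It is not the official dataloader: the `load_log` helper is hypothetical, and the key names inside `depth.npz` and `data.pkl` (here `frames` and `joint_angles`) are placeholder assumptions; the real keys are defined by the NeuralFeels dataloaders. The demo runs against a synthetic mini-tree so it is self-contained.

```python
import pickle
import tempfile
from pathlib import Path

import numpy as np

def load_log(root, obj, log, camera="front-left", finger="index"):
    """Gather one log directory's files following the FeelSight layout above."""
    log_dir = Path(root) / obj / log
    out = {}
    # RGB-D depth frames stored as a compressed .npz per camera
    depth_file = log_dir / "realsense" / camera / "depth.npz"
    if depth_file.exists():
        with np.load(depth_file) as npz:
            out["depth"] = {k: npz[k] for k in npz.files}
    # proprioception (joint states etc.) pickled per log
    pkl_file = log_dir / "data.pkl"
    if pkl_file.exists():
        with open(pkl_file, "rb") as f:
            out["proprioception"] = pickle.load(f)
    # per-finger RGB tactile image paths, in frame order
    image_dir = log_dir / "allegro" / finger / "image"
    out["tactile_images"] = sorted(image_dir.glob("*.jpg")) if image_dir.exists() else []
    return out

# Demo on a synthetic mini-tree standing in for feelsight/077_rubiks_cube/00
with tempfile.TemporaryDirectory() as tmp:
    log_dir = Path(tmp) / "077_rubiks_cube" / "00"
    cam_dir = log_dir / "realsense" / "front-left"
    cam_dir.mkdir(parents=True)
    np.savez(cam_dir / "depth.npz", frames=np.zeros((3, 4, 4), dtype=np.float32))
    with open(log_dir / "data.pkl", "wb") as f:
        pickle.dump({"joint_angles": np.zeros((3, 16))}, f)
    data = load_log(tmp, "077_rubiks_cube", "00")
    print(data["depth"]["frames"].shape)  # (3, 4, 4)
    print(len(data["tactile_images"]))    # 0 (no tactile images in the demo tree)
```

Swap `tmp` for the actual `feelsight/` or `feelsight_real/` root to point the sketch at the downloaded dataset.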

Citation

If you find NeuralFeels useful in your research, please consider citing our paper:

@article{suresh2024neuralfeels,
  title={{N}eural feels with neural fields: {V}isuo-tactile perception for in-hand manipulation},
  author={Suresh, Sudharshan and Qi, Haozhi and Wu, Tingfan and Fan, Taosha and Pineda, Luis and Lambeta, Mike and Malik, Jitendra and Kalakrishnan, Mrinal and Calandra, Roberto and Kaess, Michael and Ortiz, Joseph and Mukadam, Mustafa},
  journal={Science Robotics},
  pages={adl0628},
  year={2024},
  publisher={American Association for the Advancement of Science}
}