FeelSight: A visuo-tactile robot manipulation dataset
FeelSight is a dataset of vision, touch, and proprioception data collected during in-hand rotation of objects with an RL policy. It comprises 70 experiments in total, 30 in the real world and 40 in simulation, each lasting 30 seconds. For training neural field models with FeelSight, refer to the NeuralFeels repository.
Simulation data
Our simulated data is collected in IsaacGym with the TACTO touch simulator in the loop.
Real-world data
Here's an example of real-world data from our three-camera setup and the DIGIT-Allegro hand:
Robot setup
The Allegro hand is mounted on the Franka Emika Panda robot. The hand is sensorized with DIGIT tactile sensors and surrounded by three Intel RealSense cameras.
Dataset structure
For dataloaders, refer to the NeuralFeels repository; a minimal loading sketch also follows the directory tree below.
feelsight/                        # root directory, either feelsight or feelsight_real
├── object_1/                     # e.g. 077_rubiks_cube
│   ├── 00/                       # log directory
│   │   ├── allegro/              # tactile sensor data
│   │   │   ├── index/            # finger id
│   │   │   │   ├── depth/        # only in sim, ground-truth
│   │   │   │   │   └── ..jpg
│   │   │   │   ├── image/        # RGB tactile images
│   │   │   │   │   └── ..jpg
│   │   │   │   └── mask/         # only in sim, ground-truth
│   │   │   │       └── ..jpg
│   │   │   └── ..
│   │   ├── realsense/            # RGB-D data
│   │   │   ├── front-left/       # camera id
│   │   │   │   ├── image/        # RGB images
│   │   │   │   │   └── ..jpg
│   │   │   │   ├── seg/          # only in sim, ground-truth
│   │   │   │   │   └── ..jpg
│   │   │   │   └── depth.npz     # depth images
│   │   │   └── ..
│   │   ├── object_1.mp4          # video of sensor stream
│   │   └── data.pkl              # proprioception data
│   └── ..
├── object_2/
│   └── ..
└── ..
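Below is a minimal sketch of reading one such log in Python. It is not the official dataloader (see the NeuralFeels repository for that), and the object name, the structure of data.pkl, and the array names inside depth.npz are assumptions here; inspect the files rather than relying on the names shown.

import pickle
from pathlib import Path

import imageio.v3 as iio
import numpy as np

# One 30-second log; "077_rubiks_cube" is the example object name from above.
log_dir = Path("feelsight/077_rubiks_cube/00")

# Proprioception data: a pickled Python object (its structure is an
# assumption; print it to discover the actual contents).
with open(log_dir / "data.pkl", "rb") as f:
    proprio = pickle.load(f)
print(type(proprio))

# RGB tactile frames for one finger of the Allegro hand.
tactile_frames = sorted((log_dir / "allegro" / "index" / "image").glob("*.jpg"))
tactile0 = iio.imread(tactile_frames[0])

# RealSense RGB frames and the compressed depth archive for one camera.
rgb_frames = sorted((log_dir / "realsense" / "front-left" / "image").glob("*.jpg"))
depth = np.load(log_dir / "realsense" / "front-left" / "depth.npz")
print(depth.files)  # list the stored array names instead of guessing them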
Citation
If you find NeuralFeels useful in your research, please consider citing our paper:
@article{suresh2024neuralfeels,
title={{N}eural feels with neural fields: {V}isuo-tactile perception for in-hand manipulation},
author={Suresh, Sudharshan and Qi, Haozhi and Wu, Tingfan and Fan, Taosha and Pineda, Luis and Lambeta, Mike and Malik, Jitendra and Kalakrishnan, Mrinal and Calandra, Roberto and Kaess, Michael and Ortiz, Joseph and Mukadam, Mustafa},
journal={Science Robotics},
pages={adl0628},
year={2024},
publisher={American Association for the Advancement of Science}
}