TactileTracking: A tactile-based object tracking dataset
We present a benchmark dataset for tactile-based object tracking, featuring 12 distinct objects and 84 tracking trials (7 trials per object), each lasting an average of 10.2 seconds. The dataset includes tactile video, per-frame 6DoF ground-truth sensor poses, and pre-processed surface geometry derived from each tactile video frame. For a robust, real-time, and accurate tactile-based object tracking solution, explore our work NormalFlow. To compare NormalFlow with other methods on this dataset, use the NormalFlow Experiments repository.
Collection Setup
The dataset includes 12 objects across three categories: seven everyday items (four from the YCB dataset [1]), two small textured objects, and three geometric shapes, shown in the left figure below. To collect data, each object is fixed on a workbench and a motion capture system tracks the sensor's pose while it is in contact with the object, as illustrated in the right figure below.
Dataset Structure
The dataset is collected using the GelSight Mini sensor (without markers). For each object, seven tracking trials are collected, each initiated at a different contact location, as shown in the figure below.
Each data collection trial directory contains the following components (a loading sketch follows the list):
- gelsight.avi: Tactile video collected during the trial, containing N frames.
- webcam.avi: Third-person view video capturing the data collection process.
- true_start_T_currs.npy: An (N, 4, 4) array representing the sensor's 6DoF pose for each tactile frame in gelsight.avi, formatted as homogeneous transformation matrices.
- contact_masks.npy: An (N, H, W) array of the computed contact masks for each frame in gelsight.avi, derived solely from the tactile images.
- gradient_maps.npy: An (N, H, W, 2) array of the computed gradient maps for each frame in gelsight.avi, based only on the tactile images.
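To make the layout concrete, here is a minimal loading sketch in Python. It is illustrative rather than official: the trial path is hypothetical, and it assumes numpy and opencv-python are installed.

```python
import cv2
import numpy as np

trial_dir = "path/to/trial"  # hypothetical path to one trial directory

# 6DoF ground-truth poses: (N, 4, 4) homogeneous transforms of each
# tactile frame relative to the first (reference) frame.
poses = np.load(f"{trial_dir}/true_start_T_currs.npy")
# Contact masks: (N, H, W), one mask per tactile frame.
masks = np.load(f"{trial_dir}/contact_masks.npy")
# Gradient maps: (N, H, W, 2) per-pixel surface gradients.
grads = np.load(f"{trial_dir}/gradient_maps.npy")

# Read the tactile video frame by frame.
cap = cv2.VideoCapture(f"{trial_dir}/gelsight.avi")
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(frame)
cap.release()

# All per-frame arrays should line up with the video length N.
assert len(frames) == len(poses) == len(masks) == len(grads)
```

The gradient maps can be turned into a height map (the pre-processed surface geometry mentioned above) with any least-squares gradient-integration method. The sketch below uses a standard FFT-based (Frankot-Chellappa) integrator; it is one common choice, not necessarily the exact pre-processing used for this dataset, and it assumes the last axis of gradient_maps.npy stores the x and y gradients in that order.

```python
def integrate_gradients(gx, gy):
    """Least-squares integration of a gradient field into a height map
    (Frankot-Chellappa), recovered up to an unknown additive constant."""
    h, w = gx.shape
    u = np.fft.fftfreq(w) * 2.0 * np.pi   # frequencies along x (columns)
    v = np.fft.fftfreq(h) * 2.0 * np.pi   # frequencies along y (rows)
    uu, vv = np.meshgrid(u, v)
    gx_f, gy_f = np.fft.fft2(gx), np.fft.fft2(gy)
    denom = uu**2 + vv**2
    denom[0, 0] = 1.0                     # avoid divide-by-zero at DC
    z_f = (-1j * uu * gx_f - 1j * vv * gy_f) / denom
    z_f[0, 0] = 0.0                       # fix the free constant to zero
    return np.real(np.fft.ifft2(z_f))

# Height map for the first frame, masked to the contact region.
height = integrate_gradients(grads[0, ..., 0], grads[0, ..., 1])
height = height * masks[0]
```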
Dataset Statistics
Our benchmark dataset focuses on frame-to-frame object pose tracking, with each trial ensuring overlap between the first (reference) frame and subsequent (target) frames. This setup restricts the object to local movement without long-distance shifts. The table below details the average 6DoF movement range for each object. This dataset prioritizes rotational movement, as excessive translational sliding risks damaging the sensor’s gel.
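As an example of how movement ranges like those in the table can be computed from the released poses, the sketch below decomposes each start_T_curr transform into a rotation angle and a translation distance relative to the reference frame. It assumes numpy and an illustrative path; the translation unit depends on the pose convention of the release.

```python
import numpy as np

poses = np.load("path/to/trial/true_start_T_currs.npy")  # (N, 4, 4)

R = poses[:, :3, :3]  # rotation of each frame relative to the reference
t = poses[:, :3, 3]   # translation of each frame relative to the reference

# Rotation angle from the matrix trace: trace(R) = 1 + 2*cos(theta).
cos_theta = np.clip((np.trace(R, axis1=1, axis2=2) - 1.0) / 2.0, -1.0, 1.0)
angles_deg = np.degrees(np.arccos(cos_theta))

dists = np.linalg.norm(t, axis=1)  # translation distance per frame

print(f"max rotation: {angles_deg.max():.1f} deg, "
      f"max translation: {dists.max():.4f} (pose units)")
```

Frame-to-frame motion between frames i and j follows as `np.linalg.inv(poses[i]) @ poses[j]`.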
Cite Us
If you find this dataset useful, please consider citing our paper:
```bibtex
@ARTICLE{huang2024normalflow,
  author={Huang, Hung-Jui and Kaess, Michael and Yuan, Wenzhen},
  journal={IEEE Robotics and Automation Letters},
  title={NormalFlow: Fast, Robust, and Accurate Contact-based Object 6DoF Pose Tracking with Vision-based Tactile Sensors},
  year={2024},
  volume={},
  number={},
  pages={1-8},
  keywords={Force and Tactile Sensing, 6DoF Object Tracking, Surface Reconstruction, Perception for Grasping and Manipulation},
  doi={10.1109/LRA.2024.3505815}}
```
Reference
[1] B. Calli, A. Singh, J. Bruce, A. Walsman, K. Konolige, S. Srinivasa, P. Abbeel, and A. M. Dollar, "Yale-CMU-Berkeley dataset for robotic manipulation research," The International Journal of Robotics Research, vol. 36, no. 3, pp. 261–268, 2017.