---
license: mit
---

# Foundation Tactile (FoTa): a multi-sensor, multi-task large dataset for tactile sensing

This repository stores the FoTa dataset and the pretrained checkpoints of Transferable Tactile Transformers (T3).

[Paper](https://arxiv.org/abs/2406.13640) · Code · Colab

[Project Website]

Jialiang (Alan) Zhao, Yuxiang Ma, Lirui Wang, and Edward H. Adelson

MIT CSAIL

## Overview

FoTa was released with Transferable Tactile Transformers (T3) as a large dataset for tactile representation learning. It aggregates some of the largest open-source tactile datasets and is distributed in a unified WebDataset format.

FoTa contains over 3 million tactile images collected from 13 camera-based tactile sensors across 11 tasks.

## File structure

After downloading and unzipping, the file structure of FoTa looks like:

```
dataset_1
    |---- train
        |---- count.txt
        |---- data_000000.tar
        |---- data_000001.tar
        |---- ...
    |---- val
        |---- count.txt
        |---- data_000000.tar
        |---- ...
dataset_2
:
dataset_n
```

Each .tar file is one shard of the dataset. At runtime, the WebDataset (wds) API automatically loads, shuffles, and unpacks all shards on demand. The nicest part of storing data in .tar files, instead of packing all raw data into matrices (e.g. .npz or zarr), is that a .tar file is easy to inspect without any code: simply double-click any .tar file to check its content.

Although you will never need to unpack a .tar file manually (wds does that automatically), it helps to understand its internal structure.

```
data_000000.tar
    |---- file_name_1.jpg
    |---- file_name_1.json
    :
    |---- file_name_n.jpg
    |---- file_name_n.json
```

The .jpg files are tactile images, and the .json files store task-specific labels.
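As an illustration of how shards are consumed, here is a minimal loading sketch using the `webdataset` package. The shard path is hypothetical; the decoding keys follow the .jpg/.json layout above.

```python
# Minimal sketch: stream samples from one FoTa shard with WebDataset.
# The shard path is illustrative; point it at any data_*.tar file.
import webdataset as wds

url = "dataset_1/train/data_000000.tar"
dataset = (
    wds.WebDataset(url)
    .shuffle(1000)            # shuffle within a 1000-sample buffer
    .decode("pil")            # decode .jpg entries into PIL images
    .to_tuple("jpg", "json")  # pair each tactile image with its label dict
)

for image, label in dataset:
    print(image.size, label)
    break
```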

For more details on reproducing the experiments in the paper, check out our GitHub repository and Colab tutorial.

## Getting started

Check out our Colab for a step-by-step tutorial!

### Download and unpack

Download the dataset either through the web interface or with the Python API. First, install the client library:

```bash
pip install huggingface_hub
```

Then, inside a Python script or an IPython session, run:

```python
from huggingface_hub import snapshot_download

snapshot_download(repo_id="alanz-mit/FoundationTactile", repo_type="dataset", local_dir=".", local_dir_use_symlinks=False)
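```

Equivalently, recent versions of huggingface_hub ship a command-line downloader; the invocation below is a sketch and assumes huggingface_hub >= 0.19:

```bash
huggingface-cli download alanz-mit/FoundationTactile --repo-type dataset --local-dir .
```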

The dataset has been split into many .zip parts. To reassemble and unpack it:

```bash
cd dataset
zip -s 0 FoTa_dataset.zip --out unsplit_FoTa_dataset.zip
unzip unsplit_FoTa_dataset.zip
```
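
As a quick sanity check after unpacking, you can count the shards in a split. The directory name below is illustrative; substitute a real dataset directory from the file structure above.

```python
# Count the .tar shards in one split after unpacking.
# "dataset_1/train" is an illustrative path.
from pathlib import Path

split_dir = Path("dataset_1/train")
shards = sorted(split_dir.glob("data_*.tar"))
print(f"{len(shards)} shards found in {split_dir}")
```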

## Citation

```bibtex
@article{zhao2024transferable,
      title={Transferable Tactile Transformers for Representation Learning Across Diverse Sensors and Tasks},
      author={Jialiang Zhao and Yuxiang Ma and Lirui Wang and Edward H. Adelson},
      year={2024},
      eprint={2406.13640},
      archivePrefix={arXiv},
}
```

This dataset and the pretrained checkpoints are released under the MIT License.