
🦾 Heterogeneous Pre-trained Transformers

Lirui Wang, Xinlei Chen, Jialiang Zhao, Kaiming He

Neural Information Processing Systems (Spotlight), 2024

You can find more details on our project page. An alternative, clean implementation of HPT on Hugging Face can also be found here.

TL;DR: HPT aligns different embodiments to a shared latent space and investigates scaling behaviors in policy learning. Put a scalable transformer in the middle of your policy and don't train from scratch!
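As a rough illustration of the "transformer in the middle" idea, the sketch below shows a policy built from an embodiment-specific stem, a shared transformer trunk, and an embodiment-specific head. The module and parameter names are hypothetical and this is not the official HPT API; it only conveys the architecture pattern where the trunk is the reusable, pre-trained component.

```python
# Minimal sketch of the HPT-style layout (hypothetical names, not the official API):
# a per-embodiment "stem" tokenizes heterogeneous observations into a shared latent
# space, a shared transformer "trunk" processes the tokens, and a per-embodiment
# "head" decodes actions. Only the trunk is intended to be pre-trained and reused.
import torch
import torch.nn as nn

class PolicyWithSharedTrunk(nn.Module):
    def __init__(self, obs_dim: int, action_dim: int, latent_dim: int = 256, num_tokens: int = 16):
        super().__init__()
        self.num_tokens = num_tokens
        self.latent_dim = latent_dim
        # embodiment-specific stem: maps observations to a fixed number of latent tokens
        self.stem = nn.Linear(obs_dim, latent_dim * num_tokens)
        # shared, scalable transformer trunk (the part that would be pre-trained)
        self.trunk = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=latent_dim, nhead=8, batch_first=True),
            num_layers=4,
        )
        # embodiment-specific head: pools latents and predicts actions
        self.head = nn.Linear(latent_dim, action_dim)

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        tokens = self.stem(obs).view(obs.shape[0], self.num_tokens, self.latent_dim)
        latents = self.trunk(tokens)
        return self.head(latents.mean(dim=1))  # mean-pool tokens, decode an action

# Example usage: a batch of 2 proprioceptive observations -> a (2, 7) action tensor.
policy = PolicyWithSharedTrunk(obs_dim=39, action_dim=7)
action = policy(torch.randn(2, 39))
```

To adapt such a policy to a new embodiment, one would keep the trunk weights and swap in a new stem and head, which is the "don't train from scratch" part of the TL;DR.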

If you find HPT useful in your research, please consider citing:

@inproceedings{wang2024hpt,
  author    = {Lirui Wang and Xinlei Chen and Jialiang Zhao and Kaiming He},
  title     = {Scaling Proprioceptive-Visual Learning with Heterogeneous Pre-trained Transformers},
  booktitle = {Neural Information Processing Systems (NeurIPS)},
  year      = {2024}
}

Contact

If you have any questions, feel free to contact me through email (liruiw@mit.edu). Enjoy!
