---
license: cc-by-nc-4.0
---

# Habitat Humanoids

Habitat 3.0 provides support for diverse humanoid avatars, displaying different shapes and motions. Avatars are based on the [SMPL-X](https://smpl-x.is.tue.mpg.de/) body model, a commonly used data-driven parametric human body model that provides a compact representation of 3D human shape and pose.

This repository provides a set of stand-alone avatars and motion files to represent humanoids walking and reaching for objects in the Habitat simulator. You can also generate new humanoids using the SMPL-X code base, or use motions from motion capture or motion generation models.

## Contents

We provide a total of 12 textured avatars of neutral, female, and male gender, covering different body shapes. For each avatar, we provide a motion file that can be used to drive the avatar to walk in a scene or reach for objects, using a [controller](https://github.com/facebookresearch/habitat-lab/blob/main/habitat-lab/habitat/articulated_agent_controllers/humanoid_rearrange_controller.py). The folder structure is as follows:

```
├── habitat_humanoids
│   ├── neutral_0
│   │   ├── neutral_0.ao_config.json
│   │   ├── neutral_0.glb
│   │   ├── neutral_0_motion_data_smplx.pkl
│   │   ├── neutral_0.urdf
│   ├── *
│   ├── walk_motion
│   │   ├── CMU_10_04_stageii.npz
```

Here, `neutral_0` corresponds to the folder of one of the textured avatars.

- `neutral_0.ao_config.json`: contains a dictionary with information on how to link the avatar armature and skinning, as well as the semantic id of the avatar when using a semantic sensor.
- `neutral_0.glb`: contains the skinning and texture information.
- `neutral_0_motion_data_smplx.pkl`: contains the motion data used to drive the avatar; more information below.
- `neutral_0.urdf`: contains the armature, built automatically from the SMPL-X body model.
- `walk_motion/CMU_10_04_stageii.npz`: contains a motion clip from AMASS, used to build our motion file.

### Motion Data File

For each avatar, we provide a dictionary stored in `*_motion_data_smplx.pkl`, which contains the information needed to animate the character to walk around a scene and reach out to different positions (a short loading example is shown below, after the Usage section). In particular, the dictionary stores this information under the following keys:

- `walk_motion`: contains a 130-frame clip of a person performing a walking cycle. The clip corresponds to frames 300-430 of the file `CMU/10/10_04_stageii.npz` from the AMASS dataset. We provide the raw data in this repository, released under the license detailed below.
- `stop_pose`: contains a standing pose, taken from a single frame of the motion clip mentioned above.
- `left_hand`: contains a grid of 48 poses generated using [VPoser](https://github.com/nghorbani/human_body_prior), where each pose is optimized to reach a given position in 3D. In [HumanoidRearrangeController](https://github.com/facebookresearch/habitat-lab/blob/main/habitat-lab/habitat/articulated_agent_controllers/humanoid_rearrange_controller.py), we provide code to interpolate over these poses to reach multiple 3D positions.
- `right_hand`: contains the same grid of poses, used to reach positions with the *right hand*.

## Usage

Clone this repository under `data/`. We provide several files in the [habitat-lab repository](https://github.com/facebookresearch/habitat-lab) to instantiate and move the avatars around the scene, as sketched below.
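As a quick sanity check, the motion data file described above can be loaded with standard Python tooling and its top-level entries inspected. This is a minimal sketch: the path assumes the repository has been cloned under `data/` as described in the Usage section, and the internal layout of each entry is not documented here, so you may need to inspect the controller code for details.

```python
import pickle

# Path assumes the repository was cloned under data/ (see Usage above);
# adjust it to wherever the data actually lives.
motion_file = "data/habitat_humanoids/neutral_0/neutral_0_motion_data_smplx.pkl"

with open(motion_file, "rb") as f:
    motion_data = pickle.load(f)

# Expected top-level entries: walk_motion, stop_pose, left_hand, right_hand.
for key, value in motion_data.items():
    print(key, type(value))
```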
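The sketch below illustrates how the motion file can drive an avatar through the [HumanoidRearrangeController](https://github.com/facebookresearch/habitat-lab/blob/main/habitat-lab/habitat/articulated_agent_controllers/humanoid_rearrange_controller.py) linked above. It is only a rough outline: the exact constructor arguments and method names (`reset`, `calculate_walk_pose`, `get_pose`) may differ between habitat-lab versions, and applying the resulting pose to the simulated humanoid is not shown. Please refer to the controller file and the examples in habitat-lab for complete usage.

```python
import magnum as mn

from habitat.articulated_agent_controllers import HumanoidRearrangeController

# Per-avatar motion file from this repository (path assumes a clone under data/).
walk_pose_path = "data/habitat_humanoids/neutral_0/neutral_0_motion_data_smplx.pkl"

# Build a controller driven by the walk_motion / stop_pose / hand-pose data.
controller = HumanoidRearrangeController(walk_pose_path)

# Reset the controller at the avatar's current base transform
# (identity here, for illustration only).
controller.reset(mn.Matrix4.identity_init())

# Compute a walking pose toward a target position in the scene, then read the
# resulting joint pose so it can be applied to the articulated humanoid.
controller.calculate_walk_pose(mn.Vector3(1.0, 0.0, 1.0))
joint_pose = controller.get_pose()
```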
## License

The 12 provided avatars, along with their textures and the reaching poses stored in `left_hand` and `right_hand`, are released under a CC-BY-NC 4.0 License.

The motion data stored in `walk_motion` and `stop_pose`, as well as the original file `CMU_10_04_stageii.npz`, are released under the [SMPL Body Motion File License](https://smpl.is.tue.mpg.de/bodylicense.html), a Creative Commons Attribution 4.0 International License. For support or inquiries about additional SMPL Body Motion Files for commercial use, please contact info@meshcapade.com.