# MR-HuBo

Code repository for the paper: **Redefining Data Pairing for Motion Retargeting Leveraging a Human Body Prior**

Xiyana Figuera, Soogeun Park, and Hyemin Ahn

Our paper has been accepted at IROS 2024!

Project Page | Arxiv Link
## Installation

```bash
conda create -n mr-hubo python=3.8
conda activate mr-hubo
conda install pytorch-cuda=11.8 cuda-toolkit=11.8 -c pytorch -c nvidia  # change the CUDA version to match the one on your machine
git clone https://github.com/ahrilab/MR-HuBo.git
cd MR-HuBo
pip install -r requirements.txt
```

## Prepare Data

### SMPL-X & VPoser Models

You can download the SMPL-X & VPoser models via this link.
We use the `smplx neutral` model and `vposer_v2_05`.
Please make sure to put `bodymodel/smplx/neutral.npz` and the `vposer_v2_05/` folder into the `data/` folder; a placement sketch follows.
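A minimal sketch of where the files should end up, assuming the downloads landed in `~/Downloads` (the source paths are hypothetical; adjust them to your machine):

```bash
# hypothetical download locations -- adjust as needed
mkdir -p data/bodymodel/smplx
mv ~/Downloads/neutral.npz data/bodymodel/smplx/neutral.npz
mv ~/Downloads/vposer_v2_05 data/vposer_v2_05
```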
### Ground-Truth Motions

You can download the ground-truth motions of the robots via this link.
Please move the `mr_gt.pkl` file into the `data/gt_motions/` path.

### AMASS Dataset

You can download the AMASS dataset via this link.
Please download the `CMU/SMPL-X N` data from the downloads tab.
Please move the motion files (e.g. `02_05_stageii.npz`) that we use for the ground truth into `data/gt_motions/amass_data/`. The motions used for GT are listed in `data/gt_motions/README.md`. A placement sketch follows.
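Similarly, a minimal placement sketch with hypothetical download paths:

```bash
# hypothetical download locations -- adjust as needed
mkdir -p data/gt_motions/amass_data
mv ~/Downloads/mr_gt.pkl data/gt_motions/
mv ~/Downloads/CMU/02_05_stageii.npz data/gt_motions/amass_data/  # repeat for each motion listed in data/gt_motions/README.md
```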
## Folder Structure

- `data`: robot URDF and mesh files, motion data, the VPoser & SMPL-X models, and the GT motions.
- `imgs`: images for `README.md`.
- `out`: code outputs such as model weights, predicted motions, and rendered videos.
- `src`: single-purpose modules; each file is responsible for one function.
- `tools`: entry-point scripts that integrate the modules in `src` to perform each feature.
## Usage

### Generate Data

```bash
python tools/generate_data.py -r [robot_type] -s [num_seeds] -p [poses_per_seed] -d [device] -i [restart_idx]

# example
python tools/generate_data.py -r COMAN
```

### Train the Model

```bash
python tools/train.py -r [robot_type] [-d <device>] [-n <num_data>] [-ef] [-os] [-w]

# example
python tools/train.py -r REACHY -ef -os -w
python tools/train.py -r COMAN -ef -d cuda:2
```

### Evaluate the Model

```bash
python tools/evaluate_model.py -r ROBOT_TYPE [-ef] [-os] [-d DEVICE] [-em EVALUATE_MODE]

# Example
python tools/evaluate_model.py -r REACHY
python tools/evaluate_model.py -r REACHY -ef -os -d cuda -em joint
```

### Render Robot Motion

```bash
# Usage:
python tools/render_robot_motion.py -r ROBOT_TYPE -mi MOTION_IDX [-ef] -e EXTENSION --fps FPS [-s]  # for predicted motion
python tools/render_robot_motion.py -r ROBOT_TYPE -gt -mi MOTION_IDX -e EXTENSION --fps FPS [-s]    # for GT motion
# Example:
# render for predicted motion
python tools/render_robot_motion.py -r COMAN -mi 13_08 -ef -e mp4 --fps 120 -s
python tools/render_robot_motion.py -r COMAN -mi 13_18 -e mp4 --fps 120 -s
# render for GT motion
python tools/render_robot_motion.py -r=COMAN -gt -mi="13_08" -e mp4 --fps 120 -s
python tools/render_robot_motion.py -r=COMAN -gt -mi="13_18" -e mp4 --fps 120Mr. HuBo is general method which can be adapted to any humanoid robots, if a URDF (unified robot description format) of robot and scale factor for converting robot's position into SMPL position is given.
## Acknowledgements

Parts of the code are taken or adapted from the following repos:
- human-body-prior
- pymaf-x
- body-visualizer
## Citation

```bibtex
@inproceedings{MR_HuBo:2024,
  title     = {Redefining Data Pairing for Motion Retargeting Leveraging a Human Body Prior},
  author    = {Figuera, Xiyana and Park, Soogeun and Ahn, Hyemin},
  year      = {2024},
  month     = oct,
  booktitle = {2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  address   = {Abu Dhabi, United Arab Emirates}
}
```