Li-xingXiao/272-dim-Motion-Representation

# Processing scripts for the 272-dim Motion Representation

## Visualization

🔄 Left: our representation; Right: IK failure.

We refine the motion representation to enable direct conversion from joint rotations to SMPL body parameters, removing the need for an Inverse Kinematics (IK) step.

## 🔥 News

- **[2025-07]** Go to Zero has been selected as an ICCV 2025 Highlight! 🎉
- **[2025-06]** Two exciting papers using the 272-dim motion representation in this repo, MotionStreamer and Go to Zero, have been accepted to ICCV 2025! 🎉

## 📮 Change Log

📢 **2025-04-04** Released the processed 272-dim motion representation of the HumanML3D dataset. For academic use only.

📢 **2025-03-28** Released the evaluation code and the quantitative comparison of recovery from joint rotations vs. joint positions.

📢 **2025-03-13** Released the processing scripts for the modified 272-dim motion representation and the qualitative results of recovery from joint rotations vs. joint positions.

## 🚀 Getting Started

### 🐍 Python Virtual Environment

```bash
conda env create -f environment.yaml
conda activate mgpt
```

### 📥 Data Preparation

⬇️ **Download AMASS data**

- For the HumanML3D, BABEL, and KIT-ML datasets:
  - Download all "SMPL-H G" motions from the AMASS website.
  - Place them in `datasets/amass_data`.
- For Motion-X:
  - Download all "SMPL-X G" motions.
  - Place them in `datasets/amass_data_smplx`.

🤖 **Download the SMPL+H and DMPL models**

1. Download SMPL+H (the extended SMPL+H model used in the AMASS project).
2. Download DMPL (DMPLs compatible with SMPL).
3. Place all models under `./body_model/`.

👤 **Download human model files**

1. Download the files from Google Drive.
2. Place them under `./body_model/`.

⚙️ **Process AMASS data**

```bash
python amass_process.py --index_path ./test_t2m.csv --save_dir ./output/smpl_85
```
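Each processed file is 85-dimensional per frame (see the Evaluation section, where `[:22*3]` of this data is described as the SMPL rotations). A minimal sanity-check sketch, assuming the script writes `(num_frames, 85)` `.npy` arrays into `--save_dir`; the split of the remaining 19 values is an unverified assumption here and is kept as an opaque tail:

```python
import numpy as np

def split_smpl_85(motion: np.ndarray):
    """Split a (num_frames, 85) array from the processing step into parts.

    Only the first 22*3 values per frame are documented (SMPL joint
    rotations); the remainder is treated as an opaque tail.
    """
    assert motion.ndim == 2 and motion.shape[1] == 85, f"unexpected shape {motion.shape}"
    rotations = motion[:, : 22 * 3].reshape(len(motion), 22, 3)  # per-joint rotations
    tail = motion[:, 22 * 3 :]                                   # undocumented 19 values
    return rotations, tail

# Example on a synthetic 4-frame clip; in practice, np.load a file from ./output/smpl_85.
rots, tail = split_smpl_85(np.zeros((4, 85)))
print(rots.shape, tail.shape)  # (4, 22, 3) (4, 19)
```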
๐Ÿ“ Generate mapping files and text files
  • Follow UniMoCap Step2 to get:

    1. Mapping files (.csv)
    2. Text files (./{dataset}_new_text)

    (Note: Remember to set fps=30 in the h3d_to_h3d.py file.)

๐Ÿƒ Quick Start Guide

1. Transform SMPL to Z+ direction

python face_z_transform.py --filedir ./output

2. Get global joint positions through SMPL layer

python infer_get_joints.py --filedir ./output

3. Generate 272-dimensional motion representation

python representation_272.py --filedir ./output

**4. Calculate Mean and Std (Optional)**

We provide the 272-dimensional `Mean.npy` and `Std.npy` of the HumanML3D dataset under the `mean_std/` folder.

```bash
python cal_mean_std.py --input_dir ./output/Representation_272 --output_dir ./mean_std
```
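These statistics are typically used for per-dimension z-score normalization at training time. A minimal sketch, assuming `Mean.npy` and `Std.npy` are `(272,)` arrays and motions are `(num_frames, 272)`; the `eps` guard against zero std is an assumption, not part of the repository's scripts:

```python
import numpy as np

def normalize(motion, mean, std, eps=1e-8):
    """Z-score normalize a (num_frames, 272) motion with per-dimension stats."""
    return (motion - mean) / (std + eps)

def denormalize(motion, mean, std, eps=1e-8):
    """Invert normalize() to recover the raw representation."""
    return motion * (std + eps) + mean

# Synthetic example; in practice load mean_std/Mean.npy and mean_std/Std.npy.
mean, std = np.full(272, 2.0), np.full(272, 4.0)
motion = np.full((10, 272), 10.0)
normed = normalize(motion, mean, std)       # (10 - 2) / 4 = 2.0 everywhere
restored = denormalize(normed, mean, std)
assert np.allclose(restored, motion)
```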

**5. Visualize the representation (Optional)**

Recover from rotations:

```bash
python recover_visualize.py --mode rot --input_dir ./output/Representation_272 --output_dir ./visualize_result
```

Recover from positions:

```bash
python recover_visualize.py --mode pos --input_dir ./output/Representation_272 --output_dir ./visualize_result
```

**6. Convert Representation_272 to BVH (Optional)**

```bash
python representation_272_to_bvh.py --gender NEUTRAL --poses ./output/Representation_272 --output ./output/Representation_272 --fps 30 --is_folder
```

## 📖 Evaluation (Optional)

To make our 272-dim motion representation more convincing, we provide quantitative comparison results. Our goal is to obtain SMPL rotations for further usage (e.g. conversion to BVH), so we evaluate the following two ways (direct vs. IK) of recovering SMPL rotations.

We provide a quantitative comparison between the SMPL rotations recovered from:

1. Joint rotations (`[8+6*22 : 8+12*22]` in our 272-dim representation; recovered directly, no IK needed).
2. Joint positions (`[8 : 8+3*22]` in our 272-dim representation; IK needed: position -> rotation).

We refer to MoMask for the IK implementation.
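The index ranges above can be expressed as a small slicing helper. This is a sketch based only on the slices stated here: the first 8 values and the block between positions and rotations are not documented in this section, so they are kept as unlabeled assumptions:

```python
import numpy as np

NUM_JOINTS = 22

def slice_272(frame: np.ndarray):
    """Split a 272-dim frame into the blocks referenced by the evaluation.

    Documented: positions at [8 : 8+3*22], 6D rotations at
    [8+6*22 : 8+12*22]. The first 8 values and the middle 66 values are
    left unlabeled here (assumed root info / extra per-joint channels).
    """
    assert frame.shape[-1] == 272
    root = frame[..., :8]                                              # unlabeled root block
    positions = frame[..., 8 : 8 + 3 * NUM_JOINTS]                     # 66 values, 3 per joint
    middle = frame[..., 8 + 3 * NUM_JOINTS : 8 + 6 * NUM_JOINTS]       # 66 unlabeled values
    rotations = frame[..., 8 + 6 * NUM_JOINTS : 8 + 12 * NUM_JOINTS]   # 132 values, 6D per joint
    return root, positions, middle, rotations

root, pos, mid, rot = slice_272(np.zeros(272))
print(len(root), len(pos), len(mid), len(rot))  # 8 66 66 132
```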

We use the angle error (geodesic distance) between the GT SMPL rotations and the recovered rotations, i.e. the minimum rotation angle between them, as the metric.

GT: the 85-dim data produced by Step 1 of the Quick Start Guide is used as GT (`[:22*3]` denotes the SMPL rotations).
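For reference (this is the standard definition of the metric, not the repository's `cal_angle_error.py` implementation), the geodesic distance between two rotation matrices is the rotation angle of their relative rotation:

```python
import numpy as np

def geodesic_angle(R1: np.ndarray, R2: np.ndarray) -> float:
    """Minimum rotation angle (radians) between rotation matrices R1 and R2.

    angle = arccos((trace(R1^T @ R2) - 1) / 2), clipped for numerical safety.
    """
    cos = (np.trace(R1.T @ R2) - 1.0) / 2.0
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Example: identity vs. a 90-degree rotation about the z-axis.
Rz90 = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
print(geodesic_angle(np.eye(3), Rz90))  # ~1.5708 (pi/2)
```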

We evaluate:

1. The mean and maximum joint errors across all files (denoted Emean and Emax).
2. The per-joint mean errors across all files (denoted E0, E1, ..., E21).

Evaluation is done on the HumanML3D dataset (processed by our scripts).

Evaluation of recovery from rotations (direct, no IK needed):

```bash
python cal_angle_error.py --mode rot
```

Evaluation of recovery from positions (IK needed: position -> rotation):

```bash
python cal_angle_error.py --mode pos
```

๐Ÿ“ Evaluation Results


🔥🔥🔥 The errors of direct recovery from joint rotations (no IK needed) are significantly lower than those of recovery from joint positions (IK needed: position -> rotation)!

## 🎬 Visualization Results

Left: recover from rotation. Right: recover from position.

## 🤗 Processed 272-dim Motion Representation

To facilitate researchers, we provide the processed 272-dim motion representation of:

- the HumanML3D dataset at this link.
- the BABEL dataset at this link.

❗️❗️❗️ The processed data is solely for academic purposes. Make sure you read through the AMASS License and BABEL License.

1. Download the processed 272-dim HumanML3D dataset:

```bash
huggingface-cli download --repo-type dataset --resume-download lxxiao/272-dim-HumanML3D --local-dir ./humanml3d_272
cd ./humanml3d_272
unzip texts.zip
unzip motion_data.zip
```

The dataset is organized as:

```
./humanml3d_272
  ├── mean_std
  │   ├── Mean.npy
  │   └── Std.npy
  ├── split
  │   ├── train.txt
  │   ├── val.txt
  │   └── test.txt
  ├── texts
  │   ├── 000000.txt
  │   └── ...
  └── motion_data
      ├── 000000.npy
      └── ...
```
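Once unzipped, the split files and their motion/text pairs can be iterated as below. This is a sketch assuming each line of `split/train.txt` is a file stem shared by `texts/<stem>.txt` and `motion_data/<stem>.npy`, with each motion stored as a `(num_frames, 272)` array; the demo builds a tiny synthetic copy of the layout in a temp directory:

```python
import os
import tempfile
import numpy as np

def load_split(root: str, split: str = "train"):
    """Yield (name, motion, text) for each entry listed in split/<split>.txt."""
    with open(os.path.join(root, "split", f"{split}.txt")) as f:
        names = [line.strip() for line in f if line.strip()]
    for name in names:
        motion = np.load(os.path.join(root, "motion_data", f"{name}.npy"))
        with open(os.path.join(root, "texts", f"{name}.txt")) as f:
            text = f.read().strip()
        yield name, motion, text

# Demo on a synthetic copy of the layout; replace root with ./humanml3d_272.
root = tempfile.mkdtemp()
for sub in ("split", "motion_data", "texts"):
    os.makedirs(os.path.join(root, sub))
with open(os.path.join(root, "split", "train.txt"), "w") as f:
    f.write("000000\n")
np.save(os.path.join(root, "motion_data", "000000.npy"), np.zeros((40, 272)))
with open(os.path.join(root, "texts", "000000.txt"), "w") as f:
    f.write("a person walks forward.")

for name, motion, text in load_split(root):
    print(name, motion.shape, text)  # 000000 (40, 272) a person walks forward.
```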
2. Download the processed 272-dim BABEL dataset:

```bash
huggingface-cli download --repo-type dataset --resume-download lxxiao/272-dim-BABEL --local-dir ./babel_272
cd ./babel_272
unzip texts.zip
unzip motion_data.zip
```

The dataset is organized as:

```
./babel_272
  ├── t2m_babel_mean_std
  │   ├── Mean.npy
  │   └── Std.npy
  ├── split
  │   ├── train.txt
  │   └── val.txt
  ├── texts
  │   ├── 000000.txt
  │   └── ...
  └── motion_data
      ├── 000000.npy
      └── ...
```

## 🌹 Acknowledgement

This repository builds upon the following awesome datasets and projects:

## 📚 License

This codebase is released under the MIT License.
Please note that it also relies on external libraries and datasets, each of which may be subject to its own license and terms of use.

๐Ÿค๐Ÿผ Citation

The following exciting works use the 272-dim motion representation in this repo.

If our project is helpful for your research, please consider citing :

ICCV 2025:

```bibtex
@InProceedings{Xiao_2025_ICCV,
    author    = {Xiao, Lixing and Lu, Shunlin and Pi, Huaijin and Fan, Ke and Pan, Liang and Zhou, Yueer and Feng, Ziyong and Zhou, Xiaowei and Peng, Sida and Wang, Jingbo},
    title     = {MotionStreamer: Streaming Motion Generation via Diffusion-based Autoregressive Model in Causal Latent Space},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2025},
    pages     = {10086-10096}
}
```

ICCV 2025 Highlight:

```bibtex
@InProceedings{Fan_2025_ICCV,
    author    = {Fan, Ke and Lu, Shunlin and Dai, Minyue and Yu, Runyi and Xiao, Lixing and Dou, Zhiyang and Dong, Junting and Ma, Lizhuang and Wang, Jingbo},
    title     = {Go to Zero: Towards Zero-shot Motion Generation with Million-scale Data},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2025},
    pages     = {13336-13348}
}
```
