SIGGRAPH 2025 (Conference Track) Project Page: https://ataga101.github.io/mamm-project-page/
We introduce a novel method for controlling a motion sequence with an arbitrary temporal control sequence via temporal alignment. Temporal alignment of motion has gained significant attention owing to its applications in motion control and retargeting. Traditional methods rely on either learned or hand-crafted cross-domain mappings between frames in the original and control domains, which often require large, paired, or annotated datasets and time-consuming training. Our approach, named Metric-Aligning Motion Matching, achieves alignment by considering only within-domain distances: it computes distances among patches within each domain and seeks a matching that optimally aligns the two within-domain distance structures. This framework allows a motion sequence to be aligned to various types of control sequences, including sketches, labels, audio, and other motion sequences, all without manually defined mappings or training on annotated data. We demonstrate the effectiveness of our approach through applications in efficient motion control, showcasing its potential in practical scenarios.
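To make the idea concrete, here is a minimal sketch of metric-aligned matching using POT (Python Optimal Transport, one of the libraries credited below). It aligns two toy feature sequences by comparing only their within-domain distance matrices via Gromov-Wasserstein optimal transport; the frame-level features, sequence sizes, and solver choice are illustrative assumptions, not the paper's patch-based algorithm.

```python
# Minimal sketch: align two sequences using only within-domain distances.
# Illustrative only -- the paper matches motion patches; this toy example
# matches individual frames of random feature sequences.
import numpy as np
import ot  # Python Optimal Transport: pip install pot

rng = np.random.default_rng(0)
target = rng.standard_normal((120, 8))   # toy per-frame motion features
control = rng.standard_normal((60, 2))   # toy per-frame control features

# Within-domain distance matrices; no cross-domain metric is ever defined.
C1 = ot.dist(target, target)
C2 = ot.dist(control, control)
C1 /= C1.max()
C2 /= C2.max()

# Gromov-Wasserstein finds a coupling T that makes the two distance
# structures agree: C1[i, j] should be close to C2[k, l] whenever T
# couples i with k and j with l.
p, q = ot.unif(len(target)), ot.unif(len(control))
T = ot.gromov.gromov_wasserstein(C1, C2, p, q, loss_fun='square_loss')

# For each control frame, pick the most strongly coupled target frame.
match = T.argmax(axis=0)
print(match[:10])
```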
We have the following applications in this repository:
- (1) waveform-to-motion: control motion using a waveform
- (2) sketch-to-motion: control motion using a sketch
- (3) motion-by-numbers (label-to-motion): control motion using labels
- (4) motion-to-motion alignment: control motion using another motion sequence
- (5a) speech-to-motion: control motion using speech
- (5b) music-to-motion: control motion using music
You must first install PyTorch if you plan to use a GPU to accelerate computation. You can install PyTorch by following the instructions on the official website: https://pytorch.org/get-started/locally/
Then, install the required packages by running:
# pip install bpy (optional)
pip install -r requirements.txt

We tested our code on Python 3.11.
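If you installed PyTorch for GPU acceleration, you can verify that it sees your device with a quick check:

```python
# Optional sanity check: confirm PyTorch is installed and a GPU is visible.
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
```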
We also provide Windows/Mac executable files for the sketch-to-motion application. Use at your own risk.
The results of all applications except sketch-to-motion are saved in the output directory. You can change the destination with the --output_dir option.
# Example: --curve_generator_name "generate_frequency_increasing_sin_curve" --curve_generator_args "fps=30, amplitude=1., phase=0., freq_end=0.75, freq_start=1.5"
# Please check the available curve generators in mamm/data_utils/waveform/gen_curve.py
python scripts/run_waveform_to_motion.py -t <original motion bvh file path> --curve_generator_name <curve generator name> --curve_generator_args <curve generator args>
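For reference, a frequency-sweeping sine generator matching the example arguments above (fps, amplitude, phase, freq_start, freq_end) might look like the sketch below; the actual generators live in mamm/data_utils/waveform/gen_curve.py and may differ, and the duration parameter is an assumption added here.

```python
# Sketch of a frequency-sweeping sine generator mirroring the example
# arguments above. Not the implementation in
# mamm/data_utils/waveform/gen_curve.py; `duration` is a made-up parameter.
import numpy as np

def generate_frequency_increasing_sin_curve(fps=30, amplitude=1.0, phase=0.0,
                                            freq_start=1.5, freq_end=0.75,
                                            duration=4.0):
    t = np.arange(0.0, duration, 1.0 / fps)
    # Linear chirp: the instantaneous frequency sweeps from freq_start to
    # freq_end, so the phase is the running integral of the frequency.
    freq = np.linspace(freq_start, freq_end, len(t))
    inst_phase = 2.0 * np.pi * np.cumsum(freq) / fps + phase
    return amplitude * np.sin(inst_phase)
```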
As mentioned above, we provide Windows/Mac executable files for the sketch-to-motion application; you can download them from the link above. If you want to run the sketch-to-motion application from the source code instead, run:
python mamm/web_server/app.py
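For motion-by-numbers (label-to-motion), the segmentation positions and labels are passed as comma-separated strings, as in the example below. Purely as an illustration (this helper is not part of the repository), the two argument strings could be assembled from (position, label) pairs:

```python
# Hypothetical helper (not part of this repository): format the
# --segmentation_pos / --segmentation_label arguments for
# scripts/run_label_to_motion.py from (position, label) pairs.
def format_segmentation(segments):
    positions = ", ".join(str(pos) for pos, _ in segments)
    labels = ", ".join(str(label) for _, label in segments)
    return positions, labels

pos, lab = format_segmentation([(0.0, 0), (0.25, 1), (0.5, 0), (0.75, 1)])
print(f'--segmentation_pos "{pos}" --segmentation_label "{lab}"')
```

Then run: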
# Example: --segmentation_pos "0.0, 0.25, 0.5, 0.75" --segmentation_label "0, 1, 0, 1"
python scripts/run_label_to_motion.py -t <original motion bvh file path> --segmentation_pos <segmentation position> --segmentation_label <segmentation label>

To align a motion to another motion sequence (motion-to-motion alignment), run:

python scripts/run_motion_retargeting.py -t <original motion bvh file path> -s <control motion bvh file path>

To control motion using speech, run:

python scripts/audio/run_speech_to_motion.py -t <original motion bvh file path> -s <audio file path>

To control motion using music, run:

python scripts/audio/run_music_to_motion.py -t <original motion bvh file path> -s <audio file path>

If you use this code, please cite:

@inproceedings{10.1145/3721238.3730665,
author = {Agata, Naoki and Igarashi, Takeo},
title = {Motion Control via Metric-Aligning Motion Matching},
year = {2025},
isbn = {9798400715402},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3721238.3730665},
doi = {10.1145/3721238.3730665},
booktitle = {Proceedings of the Special Interest Group on Computer Graphics and Interactive Techniques Conference Conference Papers},
articleno = {11},
numpages = {12},
keywords = {character animation, motion control, motion editing interface, optimal transport},
series = {SIGGRAPH Conference Papers '25}
}
This project is licensed under the MIT License - see the LICENSE file for details.
Part of the code is based on the following projects: GenMM, POT, Holden et al. (2016), among others.