
Official Implementation of MAMM: Motion Control via Metric-Aligning Motion Matching

SIGGRAPH 2025 (Conference Track)

Project Page: https://ataga101.github.io/mamm-project-page/

Abstract

We introduce a novel method for controlling a motion sequence with an arbitrary temporal control sequence via temporal alignment. Temporal alignment of motion has gained significant attention owing to its applications in motion control and retargeting. Traditional methods rely on either learned or hand-crafted cross-domain mappings between frames in the original and control domains, which often require large, paired, or annotated datasets and time-consuming training. Our approach, named Metric-Aligning Motion Matching, achieves alignment by considering within-domain distances only. It computes distances among patches in each domain and seeks a matching that optimally aligns the two sets of within-domain distances. This framework allows a motion sequence to be aligned to various types of control sequences, including sketches, labels, audio, and other motion sequences, all without manually defined cross-domain mappings or training on annotated data. We demonstrate the effectiveness of our approach through applications in efficient motion control, showcasing its potential in practical scenarios.
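
The matching step can be pictured as a Gromov-Wasserstein-style optimal transport problem between the two within-domain distance structures. The minimal sketch below (using POT, which the acknowledgments mention) only illustrates this idea; the patch features, weights, and selection rule are placeholder assumptions, not the repository's actual pipeline.

import numpy as np
import ot  # Python Optimal Transport (POT)

def align_by_metric_matching(motion_patches, control_patches):
    # motion_patches: (n, d1) array; control_patches: (m, d2) array.
    # Within-domain pairwise distances; no cross-domain metric is needed.
    C_motion = ot.dist(motion_patches, motion_patches)
    C_control = ot.dist(control_patches, control_patches)
    # Uniform weights over the patches in each domain.
    p = ot.unif(len(motion_patches))
    q = ot.unif(len(control_patches))
    # Coupling that best aligns the two within-domain distance structures.
    coupling = ot.gromov.gromov_wasserstein(
        C_motion, C_control, p, q, loss_fun="square_loss")
    # For each control patch, pick the motion patch carrying the most mass.
    return coupling.argmax(axis=0)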

Code

We have the following applications in this repository:

  • (1) waveform-to-motion: control motion using a waveform
  • (2) sketch-to-motion: control motion using a sketch
  • (3) motion-by-numbers (label-to-motion): control motion using labels
  • (4) motion-to-motion alignment: control motion using another motion sequence
  • (5a) speech-to-motion: control motion using speech
  • (5b) music-to-motion: control motion using music

Installation

If you plan to use a GPU to accelerate computation, first install PyTorch by following the instructions on the official website: https://pytorch.org/get-started/locally/

Then, install the required packages by running:

# pip install bpy (optional)
pip install -r requirements.txt

We tested our code on Python 3.11.
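
If you installed PyTorch for GPU acceleration, a quick check like the following (not part of the repository) confirms that CUDA is visible before running the heavier applications:

import torch  # optional sanity check for GPU availability
device = "cuda" if torch.cuda.is_available() else "cpu"
print(f"PyTorch {torch.__version__}, computing on {device}")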

We also provide Windows/Mac executable files for the sketch-to-motion application. Use at your own risk.

How to run

Except for sketch-to-motion, the results of the applications are saved in the output directory. You can change the destination with the --output_dir option.

1. waveform-to-motion

# Example: --curve_generator_name "generate_frequency_increasing_sin_curve" --curve_generator_args "fps=30, amplitude=1., phase=0., freq_end=0.75, freq_start=1.5"
# Please check the available curve generators in mamm/data_utils/waveform/gen_curve.py 
python scripts/run_waveform_to_motion.py -t <original motion bvh file path> --curve_generator_name <curve generator name> --curve_generator_args <curve generator args>
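
As a rough illustration of what a curve generator produces (the actual generators and their exact signatures live in mamm/data_utils/waveform/gen_curve.py; duration below is an extra parameter added only for this sketch), a frequency-sweeping sine control curve might look like:

import numpy as np

def frequency_sweep_sin_curve(fps=30, duration=10.0, amplitude=1.0,
                              phase=0.0, freq_start=1.5, freq_end=0.75):
    t = np.arange(0, duration, 1.0 / fps)
    # Interpolate the instantaneous frequency across the sequence, then
    # integrate it (cumulative sum / fps) to obtain the sine phase.
    freq = np.linspace(freq_start, freq_end, len(t))
    inst_phase = 2.0 * np.pi * np.cumsum(freq) / fps + phase
    return amplitude * np.sin(inst_phase)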

2. sketch-to-motion

As mentioned above, we provide Windows/Mac executable files for the sketch-to-motion application; you can download them from the link above.

If you want to run the sketch-to-motion application from the source code, run:

python mamm/web_server/app.py

3. label-to-motion

# Example: --segmentation_pos "0.0, 0.25, 0.5, 0.75" --segmentation_label "0, 1, 0, 1"
python scripts/run_label_to_motion.py -t <original motion bvh file path> --segmentation_pos <segmentation position> --segmentation_label <segmentation label>
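
One reading of the example arguments (check the script for the exact semantics): each entry of --segmentation_pos marks the normalized start of a segment, and the corresponding entry of --segmentation_label assigns that segment's label.

# Illustrative parsing only; not the repository's argument handling.
pos = [float(x) for x in "0.0, 0.25, 0.5, 0.75".split(",")]
labels = [int(x) for x in "0, 1, 0, 1".split(",")]
for start, end, label in zip(pos, pos[1:] + [1.0], labels):
    print(f"segment [{start:.2f}, {end:.2f}) -> label {label}")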

4. motion-to-motion alignment

python scripts/run_motion_retargeting.py -t <original motion bvh file path> -s <control motion bvh file path>

5a. speech-to-motion

python scripts/audio/run_speech_to_motion.py -t <original motion bvh file path> -s <audio file path>

5b. music-to-motion

python scripts/audio/run_music_to_motion.py -t <original motion bvh file path> -s <audio file path>
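
Both audio applications take an audio file as the control sequence. The sketch below loosely illustrates reducing audio to a 1-D control signal; it is purely illustrative, and the repository's actual audio feature extraction may differ and may not use librosa at all.

import librosa
import numpy as np

def audio_to_control_curve(audio_path, fps=30):
    y, sr = librosa.load(audio_path, sr=None, mono=True)
    hop = int(sr / fps)  # roughly one feature value per motion frame
    # Onset strength loosely tracks the rhythmic/energetic structure.
    curve = librosa.onset.onset_strength(y=y, sr=sr, hop_length=hop)
    return curve / (np.max(curve) + 1e-8)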

Citation

@inproceedings{10.1145/3721238.3730665,
author = {Agata, Naoki and Igarashi, Takeo},
title = {Motion Control via Metric-Aligning Motion Matching},
year = {2025},
isbn = {9798400715402},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3721238.3730665},
doi = {10.1145/3721238.3730665},
booktitle = {Proceedings of the Special Interest Group on Computer Graphics and Interactive Techniques Conference Conference Papers},
articleno = {11},
numpages = {12},
keywords = {character animation, motion control, motion editing interface, optimal transport},
location = {Vancouver, BC, Canada},
series = {SIGGRAPH Conference Papers '25}
}

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

Parts of the code are based on the following projects: GenMM, POT, and Holden et al. (2016), among others.
