[TPAMI 2026] Code for paper "3D Hand Pose Estimation via Articulated Anchor-to-Joint 3D Local Regressors"


[TPAMI] 3D Hand Pose Estimation via Articulated Anchor-to-Joint 3D Local Regressors

Changlong Jiang, Yang Xiao, Jinghong Zheng, Haohong Kuang, Cunlin Wu, Mingyang Zhang, Zhiguo Cao, Min Du, Joey Tianyi Zhou, and Junsong Yuan

TPAMI 2026 · Paper · Project · Framework · License


Visualization of anchor-to-joint offsets and weights.


📖 Introduction

This repository is the official implementation for the paper "3D Hand Pose Estimation via Articulated Anchor-to-Joint 3D Local Regressors", published in TPAMI 2026.

This paper is an extension of our CVPR 2023 work, "A2J-Transformer: Anchor-to-Joint Transformer Network for 3D Interacting Hand Pose Estimation from a Single RGB Image".


[Left] Articulating anchors through the Transformer. [Right] Anchor-to-anchor self-attention visualization. The articulated anchors capture the relationship between the hands, allowing local anchor points to be aware of the hands' global information.


🔥 News & Updates

  • (2026-01-06): The source code (training and evaluation) is now available!

πŸ› οΈ Installation

The installation procedure follows the same steps as A2J-Transformer.

1. Environment Setup

The code has been tested on Ubuntu 20.04 with NVIDIA 2080Ti/3090 GPUs. It is compatible with both PyTorch 1.7 and 1.11.

# Create conda environment with Python>=3.7.
conda create --name a2j_trans python=3.7
conda activate a2j_trans

# Install PyTorch>=1.7.1, torchvision>=0.8.2 (Recommended: v1.7.1).
conda install pytorch==1.7.1 torchvision==0.8.2 torchaudio==0.7.2 cudatoolkit=11.0 -c pytorch

# Install dependencies.
conda install tqdm numpy matplotlib scipy
pip install opencv-python pycocotools

2. Compile CUDA Operators

We follow Deformable-DETR operators. Please compile them as follows:

cd ./dab_deformable_detr/ops
sh make.sh

📂 Dataset Preparation

Please download the datasets from their official sources:


🚀 Usage

Note: For all experiments, edit config.py and set cur_dir to the absolute path of your project.
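For illustration, a minimal sketch of that setting (the variable name cur_dir comes from the README; the path is a placeholder you must replace, and the actual config.py may define it differently):

```python
import os

# Absolute path to your A2J-Transformer-Plus checkout (placeholder; replace it).
cur_dir = '/path/to/A2J-Transformer-Plus'

# If config.py sits in the project root, this derives the path automatically:
# cur_dir = os.path.dirname(os.path.abspath(__file__))

assert os.path.isabs(cur_dir), 'cur_dir must be an absolute path'
```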

1. File Organization

First, download our trained checkpoints:

Then, organize your project directory as follows:

A2J-Transformer-Plus/
├── common/
├── data/
├── main/
└── output/
    ├── rhd/
    │   └── model_dump/
    │       └── snapshot_0.pth.tar  <-- Place RHD checkpoint here
    └── ho3d/
        └── model_dump/
            └── snapshot_1.pth.tar  <-- Place HO3D checkpoint here

2. Evaluation

RHD Dataset

  1. In config.py, set:
    • dataset = 'rhd'
    • rhd_root_dir and rootnet_output_path to your dataset paths.
  2. Run:
    cd main
    python test.py --gpu <your_gpu_ids> --test_epoch 0
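Put together, the RHD-related settings in config.py might look like this (variable names taken from the README; the paths are placeholders, and the real file may differ):

```python
# Hypothetical config.py excerpt for RHD evaluation.
dataset = 'rhd'                                   # which dataset to use
rhd_root_dir = '/path/to/RHD'                     # root of the RHD dataset
rootnet_output_path = '/path/to/rootnet_output'   # RootNet predictions
```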

HO-3D V2 Dataset

  1. In config.py, set:
    • dataset = 'ho3d'
    • ho3d_anno_dir, ho3d_seg_dir, skeleton_file and obj_kps_dir to your dataset paths.
  2. Run:
    cd main
    python test.py --gpu <your_gpu_ids> --test_epoch 1

3. Training

RHD Dataset

  1. Ensure config.py is configured for RHD (see Evaluation section).
  2. Run:
    cd main
    python train.py --gpu <your_gpu_ids>

HO-3D V2 Dataset

  1. Ensure config.py is configured for HO3D (see Evaluation section).
  2. Run:
    cd main
    python train.py --gpu <your_gpu_ids>

InterHand2.6M

  1. In config.py, set:
    • dataset = 'InterHand2.6M'
    • interhand_anno_dir and interhand_images_path to your dataset paths.
  2. Run:
    cd main
    python train.py --gpu <your_gpu_ids>

Tip: The continue parameter in parse_args determines whether training resumes from the last saved epoch.
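For reference, a resume flag like this is commonly wired up as follows (a hedged sketch, not the repository's actual parse_args; everything except the --gpu and --continue names is an assumption):

```python
import argparse

def parse_args(argv=None):
    # Minimal sketch of an argument parser with a resume flag.
    parser = argparse.ArgumentParser()
    parser.add_argument('--gpu', type=str, default='0')
    # `continue` is a Python keyword, so the parsed value is stored
    # under a different attribute name via dest=.
    parser.add_argument('--continue', dest='continue_train', action='store_true',
                        help='resume training from the last saved epoch')
    return parser.parse_args(argv)

args = parse_args(['--gpu', '0,1', '--continue'])
print(args.continue_train)  # → True
```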


⚠️ License & Contact

IMPORTANT: Our code is protected by patents and cannot be used for commercial purposes.

If you have commercial needs, please contact Prof. Yang Xiao (Huazhong University of Science and Technology) at Yang_Xiao@hust.edu.cn.


πŸ–ŠοΈ Citation

If you find our work useful in your research, please cite:

@article{jiang20253d,
  title={3D Hand Pose Estimation via Articulated Anchor-to-Joint 3D Local Regressors},
  author={Jiang, Changlong and Xiao, Yang and Zheng, Jinghong and Kuang, Haohong and Wu, Cunlin and Zhang, Mingyang and Cao, Zhiguo and Du, Min and Zhou, Joey Tianyi and Yuan, Junsong},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year={2025},
  publisher={IEEE},
  doi={10.1109/TPAMI.2025.3609907}
}
