OpenGait is a flexible and extensible gait analysis project provided by the Shiqi Yu Group and supported in part by WATRIX.AI. The corresponding paper has been accepted by CVPR2023 as a highlight paper. The extension paper has been accepted to TPAMI2025.

What's New

  • [Jun 2025] The OpenGait extension paper, strengthened by the advances of DeepGaitV2, SkeletonGait, and SkeletonGait++, has been accepted for publication in TPAMI🎉 We sincerely thank the OpenGait community for its valuable contributions and continuous support.
  • [Feb 2025] The diffusion-based DenoisingGait has been accepted to CVPR2025🎉 Congratulations to Dongyang! This is his SECOND paper!
  • [Feb 2025] Chao successfully defended his Ph.D. thesis in Oct. 2024🎉🎉🎉 You can access the full text via Chao's Thesis in English or the Chinese original (樊超的学位论文).
  • [Dec 2024] The multimodal MultiGait++ has been accepted to AAAI2025🎉 Congratulations to Dongyang! This is his FIRST paper!
  • [Jun 2024] ScoNet, the first large-scale gait-based scoliosis screening benchmark, has been accepted to MICCAI2024🎉 Congratulations to Zirui! This is his FIRST paper! The code is released here, and you can refer to the project homepage for details.
  • [May 2024] The code of the Large Vision Model based method BigGait is available here, along with checkpoints for CCPG.
  • [Apr 2024] Our team's latest checkpoints for projects such as DeepGaitV2, SkeletonGait, SkeletonGait++, and SwinGait will be released on Hugging Face. Previously released checkpoints will also be gradually made available there.
  • [Mar 2024] Chao gave a talk on 'Progress in Gait Recognition'. The video and slides are both available😊
  • [Mar 2024] The code of SkeletonGait++ is released here, and you can refer to the readme for details.
  • [Mar 2024] BigGait has been accepted to CVPR2024🎉 Congratulations to Dingqiang! This is his FIRST paper!
  • [Jan 2024] The code of the transformer-based SwinGait is available here.

Our Works

  • [TPAMI'25] OpenGait: A Comprehensive Benchmark Study for Gait Recognition Towards Better Practicality. Paper. This extension adds in-depth insights into the emerging trends and challenges of gait recognition in Sec. VII.
  • [CVPR'25] On Denoising Walking Videos for Gait Recognition. Paper and [DenoisingGait Code (coming soon)]
  • [Chao's Thesis] Gait Representation Learning and Recognition, Chinese Original and English Translation.
  • [AAAI'25] Exploring More from Multiple Gait Modalities for Human Identification, Paper and MultiGait++ Code.
  • [TBIOM'24] A Comprehensive Survey on Deep Gait Recognition: Algorithms, Datasets, and Challenges, Survey Paper.
  • [MICCAI'24] Gait Patterns as Biomarkers: A Video-Based Approach for Classifying Scoliosis, Paper, Dataset, and ScoNet Code.
  • [CVPR'24] BigGait: Learning Gait Representation You Want by Large Vision Models. Paper, and BigGait Code.
  • [AAAI'24] SkeletonGait++: Gait Recognition Using Skeleton Maps. Paper, and SkeletonGait++ Code.
  • [AAAI'24] Cross-Covariate Gait Recognition: A Benchmark. Paper, CCGR Dataset, and ParsingGait Code.
  • [Arxiv'23] Exploring Deep Models for Practical Gait Recognition. Paper, DeepGaitV2 Code, and SwinGait Code.
  • [TPAMI'23] Learning Gait Representation from Massive Unlabelled Walking Videos: A Benchmark, Paper, GaitLU-1M Dataset, and GaitSSB Code.
  • [CVPR'23] LidarGait: Benchmarking 3D Gait Recognition with Point Clouds, Paper, SUSTech1K Dataset and LidarGait Code.
  • [CVPR'23] OpenGait: Revisiting Gait Recognition Towards Better Practicality, Highlight Paper, and GaitBase Code.
  • [ECCV'22] GaitEdge: Beyond Plain End-to-end Gait Recognition for Better Practicality, Paper, and GaitEdge Code.

A Real Gait Recognition System: All-in-One-Gait


The All-in-One-Gait workflow covers pedestrian tracking, segmentation, and recognition, as sketched below. See here for details.
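For orientation, here is a minimal conceptual sketch of that tracking → segmentation → recognition flow. All names below are hypothetical placeholders, not the actual All-in-One-Gait API; the real implementation lives in the code linked above.

```python
# Illustrative sketch of the All-in-One-Gait flow (tracking -> segmentation -> recognition).
# The function/parameter names are placeholders, not the OpenGait API.
from typing import Callable, Dict, List
import numpy as np

Frames = List[np.ndarray]        # raw video frames
Silhouettes = List[np.ndarray]   # binary person masks, one per frame

def identify_pedestrians(
    probe_video: Frames,
    track: Callable[[Frames], Dict[int, Frames]],     # detector + tracker: track_id -> person crops
    segment: Callable[[Frames], Silhouettes],          # person segmenter
    embed: Callable[[Silhouettes], np.ndarray],        # gait recognizer: sequence -> embedding
    gallery: Dict[str, np.ndarray],                    # enrolled identity -> gait embedding
) -> Dict[int, str]:
    """For each tracked pedestrian, extract a gait embedding and return
    the nearest gallery identity by Euclidean distance."""
    results = {}
    for track_id, person_frames in track(probe_video).items():
        emb = embed(segment(person_frames))
        results[track_id] = min(
            gallery, key=lambda gid: float(np.linalg.norm(emb - gallery[gid]))
        )
    return results
```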

Highlighted features

Getting Started

Please see 0.get_started.md. Additional tutorials are also provided for your reference.

Model Zoo

✨✨✨ You can find all the checkpoint files at Hugging Face Models! ✨✨✨

The result list of appearance-based gait recognition is available here.

The result list of pose-based gait recognition is available here.
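If you prefer to fetch checkpoints programmatically, a minimal sketch using the `huggingface_hub` client is shown below; the `repo_id` and `filename` values are placeholders and should be replaced with the actual entries listed on the Hugging Face page.

```python
# Minimal sketch: download a released checkpoint from Hugging Face and inspect it.
# repo_id and filename are placeholders; use the values from the OpenGait Hugging Face page.
from huggingface_hub import hf_hub_download
import torch

ckpt_path = hf_hub_download(
    repo_id="your-org/OpenGait-checkpoints",   # placeholder repository id
    filename="GaitBase_CASIA-B.pt",            # placeholder checkpoint file name
)
state_dict = torch.load(ckpt_path, map_location="cpu")
print(sorted(state_dict.keys())[:10])          # peek at the stored parameter names
```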

Authors

OpenGait is now mainly maintained by Dongyang Jin (金冬阳), [email protected].

Acknowledgement

Citation

@InProceedings{Fan_2023_CVPR,
    author    = {Fan, Chao and Liang, Junhao and Shen, Chuanfu and Hou, Saihui and Huang, Yongzhen and Yu, Shiqi},
    title     = {OpenGait: Revisiting Gait Recognition Towards Better Practicality},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {9707-9716}
}

@ARTICLE{fan2025opengait,
  author={Fan, Chao and Hou, Saihui and Liang, Junhao and Shen, Chuanfu and Ma, Jingzhe and Jin, Dongyang and Huang, Yongzhen and Yu, Shiqi},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence}, 
  title={OpenGait: A Comprehensive Benchmark Study for Gait Recognition Towards Better Practicality}, 
  year={2025},
  volume={},
  number={},
  pages={1-18},
  doi={10.1109/TPAMI.2025.3576283}
}

Note: This code is provided for academic purposes only and may not be used for any purpose that might be considered commercial.
