exii-uw/gait-gestures

Gait-Gestures Dataset

This repository contains supplementary materials for the UIST 2024 paper:

Ching-Yi Tsai, Ryan Yen, Daekun Kim, and Daniel Vogel.
Gait Gestures: Examining Stride and Foot Strike Variation as an Input Method While Walking.
UIST 2024 Paper

Project Structure

  • data.csv – Movement data logs from the formative experiment (also available as a Kaggle dataset).
  • raw_data_processor.ipynb – Jupyter notebook that processes the collected data and aligns the walking direction across trials.
  • stride_segmentation.ipynb – Jupyter notebook that segments the logged data into individual strides.

Data Logging in data.csv

Each row in the dataset follows the format:
<Time>, <Type>, <Gesture>, <Foot>, <Direction>, <Checkpoint>, <TrackerName>, <pos_x>, <pos_y>, <pos_z>, <ang_x>, <ang_y>, <ang_z>, <UserID>

Example:
1675970093690, I, FAST-STEP, leftfoot, RecordingBack, StartPoint, TrackerR, -0.14286, 0.16908, -5.64499, 313.58035, 252.02341, 332.307600, User26
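A minimal way to parse one such row, assuming the comma-plus-space separation shown in the example (the field names are taken from the format string above):

```python
import csv
import io

# Field names taken from the documented row format above.
FIELDS = [
    "Time", "Type", "Gesture", "Foot", "Direction", "Checkpoint",
    "TrackerName", "pos_x", "pos_y", "pos_z",
    "ang_x", "ang_y", "ang_z", "UserID",
]

line = ("1675970093690, I, FAST-STEP, leftfoot, RecordingBack, StartPoint, "
        "TrackerR, -0.14286, 0.16908, -5.64499, 313.58035, 252.02341, "
        "332.307600, User26")

# skipinitialspace drops the blank that follows each comma.
reader = csv.reader(io.StringIO(line), skipinitialspace=True)
row = dict(zip(FIELDS, next(reader)))
print(row["Gesture"], row["pos_z"])  # FAST-STEP -5.64499
```

The same field list works as the `names=` argument to `pandas.read_csv` when loading the full data.csv, which has no header row.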

Key Data Fields

Event Time and Type

  • <Time>: Event Timestamp (milliseconds since Unix Epoch, unique per participant).
  • <Type>: Event Type, represented by:
    • I – Captures a frame of input data (e.g., movement logs).
    • E – Marks an experimental event (e.g., passing a checkpoint).
    • If Type = E, the tracker and pose fields are left empty and the row records only the experimental conditions, e.g.:
      1676319280872, E, NORMAL, bothfoot, RecordingBack, MiddlePoint2,,,,,,,,User01
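Because E rows leave the pose fields empty, any processing script should separate the two row types before parsing numbers. A small sketch, using dicts keyed by the documented field names and abbreviated sample rows:

```python
# Sample rows abbreviated to the fields needed here; event (E) rows
# carry empty strings in the pose columns.
rows = [
    {"Type": "I", "Gesture": "FAST-STEP", "pos_x": "-0.14286"},
    {"Type": "E", "Gesture": "NORMAL", "pos_x": ""},
]

frames = [r for r in rows if r["Type"] == "I"]   # tracker input frames
events = [r for r in rows if r["Type"] == "E"]   # experimental events
positions = [float(r["pos_x"]) for r in frames]  # safe: only I rows parsed
```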
      

Gait Gesture and Walking Context

  • <Gesture>: Performed Gait Gesture from the predefined list:

    NORMAL, SMALL-STEP, TAP-ROTATE-IN, KICK-AHEAD, TAP-ACROSS, SLOW-STEP, TAP-IN, TAP-ROTATE-OUT, TAP-AHEAD, DRAG-AHEAD, DRAG-BEHIND, TAP-OUT, SWING-OUT, KICK-OUT, KICK-IN, FAST-STEP, HEEL, HIGH-STEP, TAP-BEHIND, SWING-IN, BRUSH, BIG-STEP
    
  • <Foot>: Foot Used (leftfoot, rightfoot, or bothfoot).

    • bothfoot appears only in normal walking trials with no specific gesture foot.
  • <Direction>: Walking Direction (RecordingForth or RecordingBack), indicating the participant’s walking phase in the study.

  • <Checkpoint>: Walking Path Checkpoint, marking the last checkpoint the participant passed:

    • StartPoint, MiddlePoint1, MiddlePoint2, EndPoint
    • (See the Walking Path figure in the repository for the experimental walking path and checkpoint locations.)
  • <UserID>: Unique Participant ID.
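Putting these fields together, a typical first step before stride analysis is selecting one participant's frames for a single gesture and tracker. A hypothetical helper (not part of the repository's notebooks), using the documented field names:

```python
def select_frames(rows, user, gesture, tracker="FootL"):
    """Return input frames (Type == 'I') matching a participant,
    gesture, and tracking source."""
    return [
        r for r in rows
        if r["Type"] == "I"
        and r["UserID"] == user
        and r["Gesture"] == gesture
        and r["TrackerName"] == tracker
    ]

# Abbreviated sample rows for illustration.
rows = [
    {"Type": "I", "UserID": "User26", "Gesture": "FAST-STEP", "TrackerName": "FootL"},
    {"Type": "I", "UserID": "User26", "Gesture": "NORMAL", "TrackerName": "FootL"},
    {"Type": "E", "UserID": "User26", "Gesture": "FAST-STEP", "TrackerName": ""},
]

fast_steps = select_frames(rows, "User26", "FAST-STEP")
```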

Tracking and Motion Data

  • <TrackerName>: Tracking Data Source, indicating the sensor origin:
    • TrackerH – Raw data from helmet tracker
    • TrackerL – Raw data from left foot tracker
    • TrackerR – Raw data from right foot tracker
    • Head – Calibrated helmet tracker data (adjusted for head size)
    • FootL – Calibrated left foot data (adjusted for shoe size)
    • FootR – Calibrated right foot data (adjusted for shoe size)
  • <pos_x>, <pos_y>, <pos_z>: Positional Data (Unity coordinates: x = right, y = up, z = forward).
  • <ang_x>, <ang_y>, <ang_z>: Rotational Data (Unity Euler angles).
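Unity reports Euler angles in the [0, 360) range (e.g., 313.58 in the example row). For analyses that compare small rotations around zero, a common preprocessing step is wrapping them to [-180, 180); a sketch, not part of the repository's notebooks:

```python
def wrap_angle(deg: float) -> float:
    """Wrap a Unity Euler angle from [0, 360) into [-180, 180)."""
    return (deg + 180.0) % 360.0 - 180.0

wrap_angle(313.58035)  # ≈ -46.41965
```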

Contact Information

For further inquiries, please reach out to:
📩 Ching-Yi Tsai: ching-yi@princeton.edu
📩 Ryan Yen: ryanyen2@mit.edu

BibTeX Citation

@inproceedings{10.1145/3654777.3676342,
author = {Tsai, Ching-Yi and Yen, Ryan and Kim, Daekun and Vogel, Daniel},
title = {Gait Gestures: Examining Stride and Foot Strike Variation as an Input Method While Walking},
year = {2024},
isbn = {9798400706288},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3654777.3676342},
doi = {10.1145/3654777.3676342},
abstract = {Walking is a cyclic pattern of alternating footstep strikes, with each pair of steps forming a stride, and a series of strides forming a gait. We conduct a systematic examination of different kinds of intentional variations from a normal gait that could be used as input actions without interrupting overall walking progress. A design space of 22 candidate Gait Gestures is generated by adapting previous standing foot input actions and identifying new actions possible in a walking context. A formative study (n=25) examines movement easiness, social acceptability, and walking compatibility with foot movement logging to calculate temporal and spatial characteristics. Using a categorization of these results, 7 gestures are selected for a wizard-of-oz prototype demonstrating an AR interface controlled by Gait Gestures for ordering food and audio playback while walking. As a technical proof-of-concept, a gait gesture recognizer is developed and tested using the formative study data.},
booktitle = {Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology},
articleno = {68},
numpages = {16},
keywords = {foot-based gesture, interaction technique, mixed reality, walking},
location = {Pittsburgh, PA, USA},
series = {UIST '24}
}
