This repository contains supplementary materials for the UIST 2024 paper:
Ching-Yi Tsai, Ryan Yen, Daekun Kim, and Daniel Vogel.
Gait Gestures: Examining Stride and Foot Strike Variation as an Input Method While Walking.
- `data.csv` – Movement data logs from the formative experiment (also available in the linked Kaggle dataset).
- `raw_data_processor.ipynb` – Jupyter notebook for processing the collected data; it aligns the walking direction of the logs.
- `stride_segmentation.ipynb` – Scripts and notebooks for stride segmentation.
Each row in the dataset follows the format:
`<Time>, <Type>, <Gesture>, <Foot>, <Direction>, <Checkpoint>, <TrackerName>, <pos_x>, <pos_y>, <pos_z>, <ang_x>, <ang_y>, <ang_z>, <UserID>`
Example:
`1675970093690, I, FAST-STEP, leftfoot, RecordingBack, StartPoint, TrackerR, -0.14286, 0.16908, -5.64499, 313.58035, 252.02341, 332.307600, User26`
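A minimal parsing sketch for a single log line, assuming standard comma-separated values with a space after each comma and the column order shown above (the `parse_row` helper name is hypothetical, not part of the repository):

```python
import csv
import io

# Column names in the documented order.
FIELDS = [
    "Time", "Type", "Gesture", "Foot", "Direction", "Checkpoint",
    "TrackerName", "pos_x", "pos_y", "pos_z", "ang_x", "ang_y", "ang_z",
    "UserID",
]

def parse_row(line: str) -> dict:
    """Split one raw log line into a dict keyed by the documented field names."""
    values = next(csv.reader(io.StringIO(line), skipinitialspace=True))
    return dict(zip(FIELDS, values))

row = parse_row(
    "1675970093690, I, FAST-STEP, leftfoot, RecordingBack, StartPoint, "
    "TrackerR, -0.14286, 0.16908, -5.64499, 313.58035, 252.02341, 332.307600, User26"
)
print(row["Gesture"], row["Foot"], row["pos_z"])  # FAST-STEP leftfoot -5.64499
```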
- `<Time>`: Event timestamp in milliseconds since the Unix epoch (unique per participant).
- `<Type>`: Event type:
  - `I` – captures a frame of input data (e.g., movement logs).
  - `E` – marks an experimental event (e.g., passing a checkpoint). When `Type = E`, the row contains only the experimental conditions, e.g.:
    `1676319280872, E, NORMAL, bothfoot, RecordingBack, MiddlePoint2,,,,,,,,User01`
- `<Gesture>`: Performed Gait Gesture, from the predefined list:
  `NORMAL`, `SMALL-STEP`, `TAP-ROTATE-IN`, `KICK-AHEAD`, `TAP-ACROSS`, `SLOW-STEP`, `TAP-IN`, `TAP-ROTATE-OUT`, `TAP-AHEAD`, `DRAG-AHEAD`, `DRAG-BEHIND`, `TAP-OUT`, `SWING-OUT`, `KICK-OUT`, `KICK-IN`, `FAST-STEP`, `HEEL`, `HIGH-STEP`, `TAP-BEHIND`, `SWING-IN`, `BRUSH`, `BIG-STEP`
- `<Foot>`: Foot used (`leftfoot`, `rightfoot`, or `bothfoot`). `bothfoot` appears only in normal walking trials with no specific gesture foot.
- `<Direction>`: Walking direction (`RecordingForth` or `RecordingBack`), indicating the participant's walking phase in the study.
- `<Checkpoint>`: Walking path checkpoint, marking the last checkpoint the participant passed.
- `<TrackerName>`: Tracking data source, indicating the sensor origin:
  - `TrackerH` – raw data from the helmet tracker
  - `TrackerL` – raw data from the left foot tracker
  - `TrackerR` – raw data from the right foot tracker
  - `Head` – calibrated helmet tracker data (adjusted for head size)
  - `FootL` – calibrated left foot data (adjusted for shoe size)
  - `FootR` – calibrated right foot data (adjusted for shoe size)
- `<pos_x>`, `<pos_y>`, `<pos_z>`: Positional data (Unity coordinates: `x` = right, `y` = up, `z` = forward).
- `<ang_x>`, `<ang_y>`, `<ang_z>`: Rotational data (Unity Euler angles, in degrees).
- `<UserID>`: Unique participant ID.
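Because positions use Unity's y-up convention, horizontal foot displacement (e.g., when estimating stride length) lies in the x–z ground plane. A small sketch of that computation; the helper name and the sample coordinates are hypothetical, not taken from the dataset:

```python
import math

def horizontal_distance(p1, p2):
    """Distance in the x-z ground plane (y is up in Unity coordinates)."""
    dx = p2[0] - p1[0]
    dz = p2[2] - p1[2]
    return math.hypot(dx, dz)

# Two successive foot positions as (pos_x, pos_y, pos_z) -- hypothetical values.
a = (-0.14286, 0.16908, -5.64499)
b = (-0.10000, 0.17000, -4.90000)
print(round(horizontal_distance(a, b), 5))
```

Note that the small `pos_y` difference (foot height) is deliberately ignored, so the measure is insensitive to vertical foot motion during a step.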
For further inquiries, please reach out to:
📩 Ching-Yi Tsai – ching-yi@princeton.edu
📩 Ryan Yen – ryanyen2@mit.edu
```bibtex
@inproceedings{10.1145/3654777.3676342,
author = {Tsai, Ching-Yi and Yen, Ryan and Kim, Daekun and Vogel, Daniel},
title = {Gait Gestures: Examining Stride and Foot Strike Variation as an Input Method While Walking},
year = {2024},
isbn = {9798400706288},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3654777.3676342},
doi = {10.1145/3654777.3676342},
abstract = {Walking is a cyclic pattern of alternating footstep strikes, with each pair of steps forming a stride, and a series of strides forming a gait. We conduct a systematic examination of different kinds of intentional variations from a normal gait that could be used as input actions without interrupting overall walking progress. A design space of 22 candidate Gait Gestures is generated by adapting previous standing foot input actions and identifying new actions possible in a walking context. A formative study (n=25) examines movement easiness, social acceptability, and walking compatibility with foot movement logging to calculate temporal and spatial characteristics. Using a categorization of these results, 7 gestures are selected for a wizard-of-oz prototype demonstrating an AR interface controlled by Gait Gestures for ordering food and audio playback while walking. As a technical proof-of-concept, a gait gesture recognizer is developed and tested using the formative study data.},
booktitle = {Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology},
articleno = {68},
numpages = {16},
keywords = {foot-based gesture, interaction technique, mixed reality, walking},
location = {Pittsburgh, PA, USA},
series = {UIST '24}
}
```
