Research Article | Open Access

Gait Gestures: Examining Stride and Foot Strike Variation as an Input Method While Walking

Published: 11 October 2024

Abstract

Walking is a cyclic pattern of alternating foot strikes, with each pair of steps forming a stride and a series of strides forming a gait. We systematically examine intentional variations from a normal gait that could serve as input actions without interrupting overall walking progress. A design space of 22 candidate Gait Gestures is generated by adapting foot input actions previously proposed for standing contexts and identifying new actions possible while walking. A formative study (n=25) examines ease of movement, social acceptability, and walking compatibility, with foot movements logged to calculate temporal and spatial characteristics. Using a categorization of these results, 7 gestures are selected for a Wizard-of-Oz prototype demonstrating an AR interface controlled by Gait Gestures for ordering food and controlling audio playback while walking. As a technical proof of concept, a gait gesture recognizer is developed and tested using the formative study data.
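The logged foot movements described above yield temporal and spatial stride characteristics, and the proof-of-concept recognizer must separate intentional gait variations from ordinary walking. The sketch below illustrates one simple way such temporal characteristics could be computed and screened from foot-strike timestamps; it is not the authors' implementation, and the function names, the median-baseline heuristic, and the 25% deviation threshold are illustrative assumptions.

```python
import numpy as np


def step_and_stride_times(strike_times):
    """Given sorted foot-strike timestamps (feet alternating), return step
    times (interval between consecutive strikes of either foot) and stride
    times (interval between consecutive strikes of the same foot)."""
    strike_times = np.asarray(strike_times, dtype=float)
    step_times = np.diff(strike_times)                    # left->right, right->left, ...
    stride_times = strike_times[2:] - strike_times[:-2]   # same-foot intervals
    return step_times, stride_times


def flag_gait_deviations(stride_times, rel_threshold=0.25):
    """Flag strides whose duration deviates from the median stride time by
    more than rel_threshold (25% here, an arbitrary illustrative value)."""
    baseline = np.median(stride_times)
    deviation = np.abs(stride_times - baseline) / baseline
    return deviation > rel_threshold


if __name__ == "__main__":
    # Synthetic log: steady 0.55 s steps with one deliberately lengthened
    # interval (4.40 s -> 5.50 s) standing in for an intentional gait variation.
    strikes = [0.00, 0.55, 1.10, 1.65, 2.20, 2.75, 3.30, 3.85, 4.40, 5.50, 6.05]
    steps, strides = step_and_stride_times(strikes)
    print("median step time (s):", round(float(np.median(steps)), 3))
    print("flagged stride indices:", np.flatnonzero(flag_gait_deviations(strides)))
```

Running the example flags the two strides spanning the artificially lengthened interval, the kind of deviation a recognizer would treat as a candidate input rather than normal walking.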

Supplemental Material

MP4 File: Video figure.
ZIP File: All Gait Gestures video demonstration.

Published In

UIST '24: Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology
October 2024, 2334 pages
ISBN: 9798400706288
DOI: 10.1145/3654777
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

1. foot-based gesture
2. interaction technique
3. mixed reality
4. walking

Qualifiers

• Research-article
• Research
• Refereed limited

Funding Sources

• Canada Foundation for Innovation Infrastructure Fund
• NSERC Discovery Grant

Conference

UIST '24

Acceptance Rates

Overall acceptance rate: 561 of 2,567 submissions (22%)
