DOI: 10.1145/3359997.3365704
research-article

FEETICHE: FEET Input for Contactless Hand gEsture Interaction

Published: 14 November 2019

Abstract

Foot input has been proposed to support hand gestures in many interactive contexts; however, little attention has been given to contactless 3D object manipulation. This matters because many applications, notably sterile surgical theaters, require contactless operation. Yet relying solely on hand gestures makes it difficult to specify precise interactions, since hand movements are hard to segment into command and interaction modes. The unfortunate results range from unintended activations to noisy interactions and misrecognized commands. In this paper, we present FEETICHE, a novel set of multi-modal interactions combining hand and foot input to support contactless 3D manipulation tasks while standing in front of large displays, driven by foot tapping and heel rotation. We use depth-sensing cameras to capture both hand and foot gestures, and we developed a simple yet robust motion-capture method to track dominant-foot input. Through two experiments, we assess how well foot gestures support mode switching and how this frees the hands to perform accurate manipulation tasks. Results indicate that users effectively rely on foot gestures to improve mode switching, and reveal improved accuracy on both rotation and translation tasks.
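The abstract names two dominant-foot gestures, foot tapping and heel rotation, as mode switches for the hands. As a rough illustration of how such gestures could be classified from per-frame 3D foot tracking, here is a minimal sketch; the `FootFrame` layout, the thresholds, and the two heuristics are our assumptions for illustration, not the authors' actual pipeline:

```python
# Hypothetical sketch of dominant-foot gesture classification from
# depth-camera tracking. Frame format and thresholds are assumptions.
from dataclasses import dataclass
from typing import Optional

TAP_HEIGHT = 0.04     # metres the toe must lift off the floor (assumed)
ROT_THRESHOLD = 15.0  # degrees of heel yaw change to count as rotation (assumed)

@dataclass
class FootFrame:
    toe_y: float       # toe height above the floor plane, in metres
    heel_angle: float  # yaw of the heel-to-toe vector, in degrees

def classify(prev: FootFrame, curr: FootFrame) -> Optional[str]:
    """Classify a pair of consecutive frames as a tap, a heel rotation, or nothing."""
    # Tap: a raised toe comes back down past half the lift threshold.
    if prev.toe_y >= TAP_HEIGHT and curr.toe_y < TAP_HEIGHT / 2:
        return "tap"
    # Heel rotation: foot yaw changes by more than the threshold between frames.
    if abs(curr.heel_angle - prev.heel_angle) > ROT_THRESHOLD:
        return "rotate"
    return None
```

In a real system the thresholds would be calibrated per user, and a debounce window would suppress spurious triggers from tracking noise.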




Published In

VRCAI '19: Proceedings of the 17th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry
November 2019
354 pages
ISBN: 978-1-4503-7002-8
DOI: 10.1145/3359997
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. 3D manipulation
  2. foot interaction
  3. hand gestures
  4. heel rotation
  5. large screens
  6. selection
  7. tapping

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

VRCAI '19

Acceptance Rates

Overall acceptance rate: 51 of 107 submissions (48%)


Article Metrics

  • Downloads (last 12 months): 98
  • Downloads (last 6 weeks): 9

Reflects downloads up to 03 Mar 2025.


Cited By

  • (2024) Advances in the development and application of non-contact intraoperative image access systems. BioMedical Engineering OnLine 23(1). https://doi.org/10.1186/s12938-024-01304-1 (30 Oct 2024)
  • (2024) Multimodal human–computer interaction in interventional radiology and surgery: a systematic literature review. International Journal of Computer Assisted Radiology and Surgery. https://doi.org/10.1007/s11548-024-03263-3 (28 Oct 2024)
  • (2023) Surveying the Social Comfort of Body, Device, and Environment-Based Augmented Reality Interactions in Confined Passenger Spaces Using Mixed Reality Composite Videos. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(3), 1–25. https://doi.org/10.1145/3610923 (27 Sep 2023)
  • (2023) Towards a Consensus Gesture Set: A Survey of Mid-Air Gestures in HCI for Maximized Agreement Across Domains. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–24. https://doi.org/10.1145/3544548.3581420 (19 Apr 2023)
  • (2023) TicTacToes: Assessing Toe Movements as an Input Modality. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–17. https://doi.org/10.1145/3544548.3580954 (19 Apr 2023)
  • (2022) The Feet in Human-Centred Security: Investigating Foot-Based User Authentication for Public Displays. Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems, 1–9. https://doi.org/10.1145/3491101.3519838 (27 Apr 2022)
  • (2022) Design requirements to improve laparoscopy via XR. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 425–429. https://doi.org/10.1109/VRW55335.2022.00093 (Mar 2022)
  • (2022) User-Defined Foot Gestures for Eyes-Free Interaction in Smart Shower Rooms. International Journal of Human–Computer Interaction 39(20), 4139–4161. https://doi.org/10.1080/10447318.2022.2109260 (18 Aug 2022)
  • (2021) A Matrix for Systematic Selection of Authentication Mechanisms in Challenging Healthcare related Environments. Proceedings of the 2021 ACM Workshop on Secure and Trustworthy Cyber-Physical Systems, 88–97. https://doi.org/10.1145/3445969.3450424 (28 Apr 2021)
  • (2021) An Evaluation of Eye-Foot Input for Target Acquisitions. Universal Access in Human-Computer Interaction: Design Methods and User Experience, 499–517. https://doi.org/10.1007/978-3-030-78092-0_34 (24 Jul 2021)
