DOI: 10.1145/3123818.3123851

Studying user-defined body gestures for navigating interactive maps

Published: 25 September 2017

Abstract

Since the creation of virtual reality, many interaction techniques have been proposed for navigating virtual worlds. Some involve body gestures and voice commands, while others rely on mechanisms such as the mouse and keyboard. Since the appearance of video games with body interaction, we have noticed how complex navigation can be in some of them: when voice commands are unavailable, players must rely on their body or a controller alone. We observed considerable frustration when users rely only on body gestures, so natural interaction does not seem to be so natural after all. In this paper we examine a user-defined body gesture language for navigating virtual worlds. We use the Wizard of Oz technique to collect the gesture data, compare performance with traditional desktop-based interaction, and analyze the results. As a result, we propose a body gesture language for navigating virtual worlds.
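The study described above is a gesture elicitation study: participants propose gestures for each navigation command (referent), and consensus is quantified. The standard metric for this is the agreement score introduced by Wobbrock et al. (2009). A minimal sketch of that computation is shown below; the gesture names and participant data are hypothetical, purely for illustration.

```python
from collections import Counter

def agreement_score(proposals):
    """Wobbrock et al. (2009) agreement score for one referent:
    the sum, over groups of identical proposed gestures, of
    (group size / total proposals) squared. Ranges from 1/n
    (no agreement among n participants) to 1.0 (full agreement)."""
    total = len(proposals)
    counts = Counter(proposals)
    return sum((n / total) ** 2 for n in counts.values())

# Hypothetical data: gestures 12 participants proposed for the
# referent "move forward" (labels are illustrative only).
forward = ["lean", "lean", "lean", "step", "step", "lean",
           "arm_push", "lean", "step", "lean", "lean", "arm_push"]
print(round(agreement_score(forward), 3))  # → 0.431
```

A gesture set is then typically built by assigning each referent the gesture proposed by the largest group, with the per-referent scores indicating how contested that choice was.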


Cited By

  • (2023) Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies. ACM Computing Surveys 56(5), 1--55. DOI: 10.1145/3636458. Online publication date: 7-Dec-2023
  • (2022) Hybrid Target Selections by "Hand Gestures + Facial Expression" for a Rehabilitation Robot. Sensors 23(1), 237. DOI: 10.3390/s23010237. Online publication date: 26-Dec-2022
  • (2022) Elicitation of Interaction Techniques with 3D Data Visualizations in Immersive Environment using HMDs. 2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 238--243. DOI: 10.1109/ISMAR-Adjunct57072.2022.00053. Online publication date: Oct-2022

Published In

Interacción '17: Proceedings of the XVIII International Conference on Human Computer Interaction
September 2017
268 pages
ISBN: 9781450352291
DOI: 10.1145/3123818

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. interactive maps
  2. natural interaction
  3. user defined body gesture
  4. virtual worlds

Qualifiers

  • Research-article

Funding Sources

  • PRODEP
  • CONACYT
  • BUAPVIEP

Conference

Interacción '17

Acceptance Rates

Overall Acceptance Rate 109 of 163 submissions, 67%
