DOI: 10.1145/2678025.2701394

Predicting Task Execution Time on Natural User Interfaces based on Touchless Hand Gestures

Published: 18 March 2015

Abstract

Model-based evaluation has been widely used in HCI. However, current predictive models are insufficient for evaluating Natural User Interfaces based on touchless hand gestures. This paper presents a KLM-based model for predicting the time needed to execute tasks on this type of interface. The required model operators were defined by considering the temporal structure of hand gestures (i.e., using gesture units) and by performing a systematic literature review. The operator times were then estimated through a multi-part user study. Finally, an empirical evaluation of the model yielded acceptable results (root-mean-square error = 10%, R² = 0.936) compared to similar models developed for other interaction styles. The proposed model should therefore help software designers carry out usability assessments by predicting performance time without user participation.
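The abstract gives only the shape of the model, so the following is a minimal sketch of how a KLM-style prediction of this kind works, assuming hypothetical operators and unit times; the paper's actual operator set and the values estimated in its user study appear only in the full text. The normalized-RMSE helper likewise reflects one common way such a percentage error is computed, not necessarily the paper's exact procedure.

```python
import math

# Hypothetical operator unit times in seconds -- placeholders for
# illustration only, NOT the values estimated in the paper's user study.
OPERATOR_TIMES = {
    "M": 1.35,  # mental preparation (classic KLM value, used here for illustration)
    "P": 1.10,  # point: move the hand onto a target (hypothetical)
    "S": 0.50,  # select: dwell or push gesture to confirm (hypothetical)
    "R": 0.40,  # retract: return the hand to a rest position (hypothetical)
}

def predict_task_time(encoding: str) -> float:
    """KLM-style prediction: execution time is the sum of the unit times
    of the operators in the task's method encoding, e.g. "MPS"."""
    return sum(OPERATOR_TIMES[op] for op in encoding)

def rmse_percent(observed: list[float], predicted: list[float]) -> float:
    """Root-mean-square error expressed as a percentage of the mean
    observed time, one common way validation studies report it."""
    mse = sum((o - p) ** 2 for o, p in zip(observed, predicted)) / len(observed)
    return 100 * math.sqrt(mse) / (sum(observed) / len(observed))

# Example: mentally prepare, point at a menu item, and select it.
print(predict_task_time("MPS"))  # 1.35 + 1.10 + 0.50 = 2.95 s
```

In KLM fashion, a designer encodes a task as a sequence of operators and sums their unit times, which is what allows performance time to be predicted without user participation.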


    Published In

    IUI '15: Proceedings of the 20th International Conference on Intelligent User Interfaces
    March 2015
    480 pages
    ISBN:9781450333061
    DOI:10.1145/2678025

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. mid-air gestures
    2. natural user interfaces
    3. predictive model
    4. touchless gestures

    Qualifiers

    • Research-article

    Conference

    IUI'15

    Acceptance Rates

IUI '15 Paper Acceptance Rate: 47 of 205 submissions (23%)
Overall Acceptance Rate: 746 of 2,811 submissions (27%)



    Cited By

• (2023) Evaluating User Interactions in Wearable Extended Reality: Modeling, Online Remote Survey, and In-Lab Experimental Methods. IEEE Access 11, 77856-77872. DOI: 10.1109/ACCESS.2023.3298598
• (2022) Performance Evaluation of HMI based on AHP and GRT for GUI. 2022 IEEE International Conference on Artificial Intelligence in Engineering and Technology (IICAIET), 1-5. DOI: 10.1109/IICAIET55139.2022.9936844
• (2022) Predicting Human Performance in Vertical Hierarchical Menu Selection in Immersive AR Using Hand-gesture and Head-gaze. 2022 15th International Conference on Human System Interaction (HSI), 1-8. DOI: 10.1109/HSI55341.2022.9869495
• (2020) Externalizing Mental Images by Harnessing Size-Describing Gestures. Proceedings of the 2020 International Conference on Advanced Visual Interfaces, 1-9. DOI: 10.1145/3399715.3399920
• (2020) Evaluating the Scalability of Non-Preferred Hand Mode Switching in Augmented Reality. Proceedings of the 2020 International Conference on Advanced Visual Interfaces, 1-9. DOI: 10.1145/3399715.3399850
• (2020) Usability and user experience evaluation of natural user interfaces: a systematic mapping study. IET Software. DOI: 10.1049/iet-sen.2020.0051
• (2018) Analyzing Mid-Air Hand Gestures to Confirm Selections on Displays. Technology Trends, 341-352. DOI: 10.1007/978-3-030-05532-5_25
• (2017) Designing hand gesture interfaces for easing students participation from their spot. 2017 IEEE 21st International Conference on Computer Supported Cooperative Work in Design (CSCWD), 133-138. DOI: 10.1109/CSCWD.2017.8066683
• (2017) Blind FLM: An Enhanced Keystroke-Level Model for Visually Impaired Smartphone Interaction. Human-Computer Interaction - INTERACT 2017, 155-172. DOI: 10.1007/978-3-319-67744-6_10
• (2017) Understanding Gesture Articulations Variability. Human-Computer Interaction - INTERACT 2017, 293-314. DOI: 10.1007/978-3-319-67684-5_18
