DOI: 10.1145/3544549.3585914
Work in Progress

“Just Like Blooming Fireworks, And Match With Function Perfectly”: Explore and Evaluate User-Defined One-Handed Gestures of Smartwatch

Published: 19 April 2023

ABSTRACT

One-handed gesture interaction is a convenient input method for smartwatches in certain scenarios, e.g., wearing a smartwatch while running or biking. To explore user-friendly one-handed gestures, what users are thinking while performing a gesture, and which characteristics make a one-handed gesture feel friendly, we developed a series of one-handed gestures for 6 basic smartwatch functions. An end-user elicitation study resulted in 12 new one-handed gestures. We compared these 12 user-defined one-handed gestures with the existing Apple Watch one-handed gestures, building a Wizard of Oz prototype and evaluating the gestures with both qualitative and quantitative approaches. The results show that our user-defined gesture set is rated as friendlier than the existing Apple Watch one-handed gestures. During the evaluation we also collected quantitative data and interesting user perspectives, and we offer design recommendations for one-handed gestures.

Footnotes

  1. All necessary code can be found at https://anonymous.4open.science/r/1035-supplementary-files-E5B3/1035-supplementary%20files.zip

Supplemental Material


Published in
      CHI EA '23: Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems
      April 2023
      3914 pages
      ISBN: 9781450394222
      DOI: 10.1145/3544549

      Copyright © 2023 Owner/Author

      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Qualifiers

      • Work in Progress
      • Research
      • Refereed limited

      Acceptance Rates

      Overall Acceptance Rate: 6,164 of 23,696 submissions, 26%
