
Interaction on-the-go: a fine-grained exploration on wearable PROCAM interfaces and gestures in mobile situations

  • Long paper
  • Published in Universal Access in the Information Society

Abstract

Wearable projector and camera (PROCAM) interfaces, which offer a natural, intuitive and spatial experience, have been studied for many years. However, existing research on hand input for such systems has focused on stable settings such as sitting or standing, which does not fully satisfy the interaction requirements of real life, especially when people are moving. Moreover, a growing number of mobile phone users operate their phones while walking. As a mobile computing device, a wearable PROCAM system should account for the fact that mobility can influence usability and user experience. This paper proposes a wearable PROCAM system with which the user interacts through finger gestures, such as the hover gesture and the pinch gesture, on projected surfaces. A lab-based evaluation was conducted that compared the two gestures (pinch and hover) in three situations (sitting, standing and walking) to answer three questions: (1) How, and to what degree, does mobility influence different gesture inputs? Are there significant differences between gesture inputs in different settings? (2) What causes these differences? (3) What do people think about the configuration of such systems, and to what extent does manual focus affect the interaction? From both qualitative and quantitative perspectives, the main findings imply that mobility affects gesture interaction to varying degrees. The pinch gesture suffers less influence than the hover gesture in mobile settings. Both gestures were affected more while walking than while sitting or standing by all four negative factors (lack of coordination, jittering hand effect, tired forearms and extra attention paid). Manual focus also influenced mobile projection interaction. Based on these findings, implications are discussed for the design of mobile projection interfaces with gestures.
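The abstract contrasts two camera-tracked finger gestures, pinch and hover. One simple way such a classifier could work is by thresholding the distance between tracked thumb and index fingertips; this is a minimal hypothetical sketch (the threshold value, coordinate units and function names are assumptions, not the paper's actual vision pipeline):

```python
import math

# Assumed threshold; the paper does not specify its detection parameters.
PINCH_THRESHOLD_MM = 20.0

def classify_gesture(thumb_tip, index_tip):
    """Classify a tracked hand pose as 'pinch' or 'hover'.

    thumb_tip / index_tip: (x, y) fingertip coordinates in millimetres,
    e.g. from a camera-based hand tracker (hypothetical input format).
    A pinch closes the thumb-index gap; a hover keeps the finger extended.
    """
    dist = math.hypot(thumb_tip[0] - index_tip[0],
                      thumb_tip[1] - index_tip[1])
    return "pinch" if dist < PINCH_THRESHOLD_MM else "hover"
```

In practice such a rule would sit downstream of a hand-segmentation and fingertip-tracking stage, and the threshold would need calibration per user and projection distance.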



Notes

  1. Based on negative ranks.

  2. Based on positive ranks.
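The phrasing of these notes ("based on negative ranks" / "based on positive ranks") is typical of Wilcoxon signed-rank test output for paired, within-subject comparisons. As a hedged illustration of what those ranks are (the function and sample data below are hypothetical, not the paper's results), the positive- and negative-rank sums W+ and W− can be computed as:

```python
def wilcoxon_signed_rank(x, y):
    """Return (W+, W-) for paired samples x, y.

    Zero differences are dropped; tied absolute differences
    receive the average of the ranks they span.
    """
    diffs = [a - b for a, b in zip(x, y) if a != b]
    # Sort indices by absolute difference to assign ranks.
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        while j < len(order) and abs(diffs[order[j]]) == abs(diffs[order[i]]):
            j += 1
        avg_rank = (i + 1 + j) / 2.0  # average of ranks i+1 .. j
        for k in range(i, j):
            ranks[order[k]] = avg_rank
        i = j
    w_plus = sum(r for r, d in zip(ranks, diffs) if d > 0)
    w_minus = sum(r for r, d in zip(ranks, diffs) if d < 0)
    return w_plus, w_minus
```

The reported test statistic is conventionally the smaller of the two sums; "based on negative ranks" indicates which direction of difference dominated the comparison.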


Author information

Correspondence to Yun Zhou or Tao Xu.


About this article


Cite this article

Zhou, Y., Xu, T., David, B. et al. Interaction on-the-go: a fine-grained exploration on wearable PROCAM interfaces and gestures in mobile situations. Univ Access Inf Soc 15, 643–657 (2016). https://doi.org/10.1007/s10209-015-0448-6

