DOI: 10.1145/2582051.2582090

research-article

Anywhere surface touch: utilizing any surface as an input area

Published: 07 March 2014

ABSTRACT

The current trend towards smaller and smaller mobile devices may cause considerable difficulties in using them. In this paper, we propose an interface called Anywhere Surface Touch, which allows any flat or curved surface in a real environment to be used as an input area. The interface uses only a single small camera and a contact microphone to recognize several kinds of interaction between the fingers of the user and the surface. The system recognizes which fingers are interacting and in which direction the fingers are moving. Additionally, the fusion of vision and sound allows the system to distinguish the contact conditions between the fingers and the surface. Evaluation experiments showed that users became accustomed to our system quickly, soon being able to perform input operations on various surfaces.


Published in

AH '14: Proceedings of the 5th Augmented Human International Conference
March 2014, 249 pages
ISBN: 9781450327619
DOI: 10.1145/2582051

Copyright © 2014 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States



Acceptance Rates

Overall Acceptance Rate: 121 of 306 submissions, 40%
