DOI: 10.1145/1358628.1358676

Natural interaction SensitiveTable

Published: 5 April 2008

ABSTRACT

The SensitiveTable is a large multi-touch display that detects and tracks hands and objects in contact with it at 60 frames per second, with a spatial resolution of about 1.5 millimeters. A software application framework allows the creation of custom natural-interaction experiences. The table is equipped with array microphones and RFID antennas along its edges. It runs a speaker-independent speech recognition engine, based on a very small vocabulary, that is invoked only in specific circumstances. RFID-tagged objects are used to populate the interface with content, activate functions, and authenticate users. Because of its analytical nature (high resolution and multi-point gestures), the table is used in public spaces mostly as a form of digital mediation between two or more people (e.g., consultant and customer): the expert can lead the novice through the more complex and less intuitive interaction dynamics. We offer the SensitiveTable as an example of analytical natural interaction.
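The abstract describes an event-driven combination of three input channels: continuous touch tracking, edge-mounted RFID antennas, and a small-vocabulary speech recognizer that is only invoked in specific circumstances. The paper does not publish its framework API, so the Python sketch below is purely illustrative: every name in it (SensitiveTableApp, on_event, the four-word VOCABULARY, and the rule that an RFID read arms the recognizer) is an assumption, not the authors' implementation.

import queue
from dataclasses import dataclass

# Hypothetical sketch of fusing the SensitiveTable's three input modalities
# in one application loop. All names are invented for illustration; the
# paper does not describe its actual API.

@dataclass
class TouchEvent:
    # A tracked contact (hand or object), reported at 60 fps, ~1.5 mm resolution.
    contact_id: int
    x_mm: float
    y_mm: float

@dataclass
class RFIDEvent:
    # A tag read by one of the antennas along the table's edges.
    tag_id: str
    antenna: int

@dataclass
class SpeechEvent:
    # A word recognized by the speaker-independent engine.
    word: str

class SensitiveTableApp:
    # Assumed tiny command set; the paper only says the vocabulary is "very small".
    VOCABULARY = {"open", "close", "next", "back"}

    def __init__(self):
        self.events = queue.Queue()
        self.speech_enabled = False          # recognizer off by default
        self.authenticated_tags = set()

    def on_event(self, event):
        if isinstance(event, TouchEvent):
            self.handle_touch(event)
        elif isinstance(event, RFIDEvent):
            # A tagged object authenticates a user and populates content.
            # Arming the recognizer here is one plausible reading of
            # "invoked only in specific circumstances".
            self.authenticated_tags.add(event.tag_id)
            self.speech_enabled = True
        elif isinstance(event, SpeechEvent) and self.speech_enabled:
            if event.word in self.VOCABULARY:
                self.run_command(event.word)

    def handle_touch(self, event):
        print(f"contact {event.contact_id} at ({event.x_mm:.1f}, {event.y_mm:.1f}) mm")

    def run_command(self, word):
        print(f"voice command: {word}")

    def run(self, frame_hz=60):
        # Drain the event queue once per display frame (60 fps in the paper).
        while True:
            try:
                self.on_event(self.events.get(timeout=1.0 / frame_hz))
            except queue.Empty:
                pass  # no input this frame

In a design like this the recognizer only listens once a tagged object establishes context, which matches the abstract's framing of speech as a situational channel rather than an always-on one.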


Supplemental Material

1358676.mp4 (MP4, 57.2 MB)


Published in

CHI EA '08: CHI '08 Extended Abstracts on Human Factors in Computing Systems
April 2008, 2035 pages
ISBN: 978-1-60558-012-8
DOI: 10.1145/1358628
Copyright © 2008 ACM


Publisher: Association for Computing Machinery, New York, NY, United States



Qualifiers: research article

Overall acceptance rate: 6,164 of 23,696 submissions, 26%
