DOI: 10.1145/1107548.1107551
Article

CHIL computing to overcome techno-clutter

Published: 12 October 2005

Abstract

After building computers that paid no attention to communicating with humans, we have in recent years developed ever more sophisticated interfaces that put the "human in the loop" of computers. These interfaces have improved usability by providing more appealing output (graphics, animations), easier-to-use input methods (mouse, pointing, clicking, dragging), and more natural interaction modes (speech, vision, gesture, etc.). Yet the productivity gains that were promised have largely not materialized, and human-machine interaction remains a partially frustrating and tedious experience, full of techno-clutter and demanding excessive attention to the technical artifact.

In this talk, I will argue that we must transition to a third paradigm of computer use, in which we let people interact with people and move the machine into the background, where it observes the humans' activities and provides services implicitly, that is, to the extent possible, without explicit request. Putting the "Computer in the Human Interaction Loop" (CHIL), instead of the other way round, however, brings formidable technical challenges. The machine must now always observe and understand humans, model their activities and their interaction with other humans, track the human state as well as the state of the space they are in, and finally infer intentions and needs. From a perceptual user interface point of view, we must process signals from sensors that are always on, frequently inappropriately positioned, and subject to much greater variability. We must also recognize not only WHAT was seen or said in a given space, but a broad range of additional information: the WHO, WHERE, HOW, TO WHOM, WHY, and WHEN of human interaction and engagement.

In this talk, I will describe a variety of multimodal interface technologies that we have developed to answer these questions, along with some preliminary CHIL-type services that take advantage of such perceptual interfaces.
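As a purely illustrative aside (not part of the talk), the sketch below shows one way the WHO / WHAT / WHERE / TO WHOM / WHEN facets of an observed interaction might be represented, and how a service could be triggered implicitly from perception alone. All names, fields, and the trigger rule are hypothetical, chosen only to make the idea of implicit, request-free service provision concrete.

    # Illustrative sketch only; not the CHIL system described in the talk.
    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class Observation:
        who: str                 # identity of the observed person
        what: str                # recognized activity or utterance
        where: str               # location within the smart space
        to_whom: Optional[str]   # addressee, if the activity is directed
        when: datetime           # timestamp of the observation

    def implicit_service(obs: Observation) -> Optional[str]:
        """Decide from perception alone whether to act in the background."""
        # Toy rule: if someone starts presenting in the meeting room,
        # quietly mute notifications for everyone present -- no request needed.
        if obs.what == "starts_presentation" and obs.where == "meeting_room":
            return "mute notifications for attendees"
        return None  # stay in the background; do not interrupt the humans

    if __name__ == "__main__":
        obs = Observation("speaker_1", "starts_presentation",
                          "meeting_room", "audience", datetime.now())
        print(implicit_service(obs))  # -> mute notifications for attendees

In an actual CHIL system these facets would of course be filled in by always-on audio-visual perception rather than hand-built records; the point of the sketch is only the inversion of the interaction loop, with the machine observing and deciding rather than waiting to be asked.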
    Published In

    sOc-EUSAI '05: Proceedings of the 2005 joint conference on Smart objects and ambient intelligence: innovative context-aware services: usages and technologies
    October 2005
    316 pages
ISBN: 1595933042
DOI: 10.1145/1107548

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Conference

sOc-EUSAI '05: Smart Objects & Ambient Intelligence
October 12-14, 2005
Grenoble, France

