ABSTRACT
Human-computer interaction using large-format displays is an active area of research that focuses on how humans can work better with computers and other machines. For this to happen, there must be an enabling technology that creates the interface between man and machine. Touch capability in a large-format display is advantageous: a large display area is informationally dense, and touch provides a natural, life-size interface to that information. This paper describes a new enabling technology in the form of a camera-based man-machine input device, which uses smart cameras to analyze the scene directly in front of a large-format computer display. The analysis determines where a user has touched the display and then treats that information as a mouse click, thereby controlling the computer. Significant technological problems have been overcome to make the system robust enough for commercialization. The paper also describes the camera-based system architecture and presents its advantages and new capabilities. The technology is ideally suited to large-format computer displays, creating a very natural interface with familiar usage paradigms for human-computer interaction.
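The core geometric idea behind a corner-camera touch system can be sketched briefly. The following is a minimal illustration, not the paper's implementation: it assumes two cameras mounted at the top-left and top-right corners of the display, each looking across the screen surface and reporting the angle between the top edge and its line of sight to the touch point. Intersecting the two rays yields the touch coordinates, which could then be forwarded to the operating system as a mouse event.

```python
import math

def triangulate(width, angle_left, angle_right):
    """Estimate a touch point from two corner cameras.

    Illustrative sketch: cameras sit at the top-left (0, 0) and
    top-right (width, 0) corners of the display, and each reports
    the angle (radians) between the top edge and its ray to the
    touch point. The angle convention and camera placement are
    assumptions, not the paper's exact design.
    """
    tl, tr = math.tan(angle_left), math.tan(angle_right)
    # Left camera ray:  y = x * tan(angle_left)
    # Right camera ray: y = (width - x) * tan(angle_right)
    # Setting the two equal and solving for x:
    x = width * tr / (tl + tr)
    y = x * tl
    return x, y
```

With a 100-unit-wide display and both cameras reporting 45 degrees, the rays intersect at (50, 50), the center of a square region below the top edge. In practice each smart camera would first segment the finger or stylus from the background and convert its pixel column to an angle before this intersection step.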
A CMOS camera-based man-machine input device for large-format interactive displays