
Interaction with large ubiquitous displays using camera-equipped mobile phones

  • Original Article
  • Published:
Personal and Ubiquitous Computing

Abstract

In the ubiquitous computing environment, people will interact with everyday objects (or the computers embedded in them) in ways that differ from the familiar desktop user interface. One typical situation is interacting with applications through large displays such as televisions, mirror displays, and public kiosks, where conventional keyboard and mouse input is usually impractical. In this setting, the mobile phone has emerged as an excellent device for novel interaction. This article introduces user interaction techniques that use a camera-equipped hand-held device, such as a mobile phone or a PDA, with large shared displays. In particular, we consider two specific but typical situations: (1) sharing the display from a distance and (2) interacting with a touch screen display at close range. Using two basic computer vision techniques, motion flow and marker recognition, we show how a camera-equipped hand-held device can effectively replace a mouse to share, select, and manipulate 2D and 3D objects, and to navigate within the environment presented through the large display.




Acknowledgments

The Lucas–Kanade feature tracker for Symbian OS was kindly provided by ACID (Australian CRC for Interaction Design) for our implementation results. This research was financially supported by the Ministry of Knowledge Economy (MKE) and the Korea Institute for Advancement of Technology (KIAT) through the Workforce Development Program in Strategic Technology.

Author information

Corresponding author

Correspondence to Gerard J. Kim.


About this article

Cite this article

Jeon, S., Hwang, J., Kim, G.J. et al. Interaction with large ubiquitous displays using camera-equipped mobile phones. Pers Ubiquit Comput 14, 83–94 (2010). https://doi.org/10.1007/s00779-009-0249-0


Keywords

Navigation