Abstract
This work addresses fingertip contact detection and localization on planar surfaces, with the aim of providing interactivity for augmented, interactive displays implemented on such surfaces. In contrast to the widely employed approach in which user hands are observed from above, the proposed approach images user hands laterally. An algorithmic pipeline for processing the corresponding visual input is proposed, extensively evaluated, and compared against the top-view approach. Advantages of the proposed approach include increased sensitivity, localization accuracy, and scalability, as well as practicality and cost efficiency of installation.
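As a rough illustration of the kind of test underlying depth-based touch detection (whether top-view or lateral), a point is commonly declared a contact when it lies within a thin "contact band" just above the fitted surface plane. The sketch below is a minimal, hypothetical version of this idea and is not the paper's actual pipeline; the plane parameters, band thresholds, and the simplification of treating pixel coordinates as metric x, y are all assumptions made for illustration.

```python
import numpy as np

def detect_touches(depth, plane, band=(5.0, 30.0)):
    """Flag depth pixels lying within a thin 'contact band' above a planar
    surface. `plane` = (a, b, c, d), with surface points satisfying
    a*x + b*y + c*z + d = 0; the plane is oriented so that points between
    the surface and the camera have positive signed distance. For
    simplicity, `depth` is treated as z on the integer pixel grid (x, y)."""
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    a, b, c, d = plane
    # Signed distance of each point from the plane, in depth units (mm here).
    dist = (a * xs + b * ys + c * depth + d) / np.sqrt(a * a + b * b + c * c)
    lo, hi = band
    # Touch = close to the surface, but not on it (noise) and not far above it.
    return (dist > lo) & (dist < hi)

# Synthetic example: a flat surface at z = 1000 mm, one "fingertip" 10 mm
# above it (a touch) and one point 200 mm above it (a hovering hand, no touch).
depth = np.full((64, 64), 1000.0)
depth[30, 30] = 990.0   # 10 mm above the surface  -> inside the band
depth[10, 10] = 800.0   # 200 mm above the surface -> outside the band
mask = detect_touches(depth, plane=(0.0, 0.0, -1.0, 1000.0))
```

In practice the surviving pixels would then be clustered into fingertip candidates and tracked over time; the lateral-view setting discussed in this work changes what the camera observes near the surface, but the contact-band idea remains a useful mental model.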
Acknowledgments
This work has been supported by the FORTH-ICS internal RTD Programme “Ambient Intelligence and Smart Environments”.
Cite this article
Ntelidakis, A., Zabulis, X., Grammenos, D. et al. Touch detection for planar interactive displays based on lateral depth views. Multimed Tools Appl 76, 12683–12707 (2017). https://doi.org/10.1007/s11042-016-3695-5