
An application oriented and shape feature based multi-touch gesture description and recognition method

Published in: Multimedia Tools and Applications

Abstract

To customize multi-touch gestures for different applications and to facilitate multi-touch gesture recognition, an application oriented and shape feature based multi-touch gesture description and recognition method is proposed. In this method, multi-touch gestures are classified into two categories, atomic gestures and combined gestures, where a combined gesture is a composition of atomic gestures linked by temporal, spatial, and logical relationships. For description, users' motions are mapped to gestures, and the semantic constraints of an application are extracted to build the accessible relationships between gestures and entity states. For recognition, the trajectories of a gesture are projected onto an image, and the shape feature of each trajectory and the relationships among trajectories are extracted and matched against gesture templates. Experiments show that the method is independent of the multi-touch platform, robust to differences in how users perform gestures, and scalable and reusable across users and applications.
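The recognition step described above projects gesture trajectories onto an image and matches their shape features against stored templates. The sketch below is a minimal illustration of that general idea in Python, assuming a simple grid rasterization and a Hausdorff-style shape distance; the function names (`rasterize`, `match`), the grid size, and the swipe templates are hypothetical and are not taken from the paper itself.

```python
import numpy as np

def rasterize(trajectory, grid=32):
    """Project a normalized touch trajectory (list of (x, y) in [0, 1])
    onto a grid x grid binary image."""
    img = np.zeros((grid, grid), dtype=bool)
    for x, y in trajectory:
        col = min(int(x * grid), grid - 1)
        row = min(int(y * grid), grid - 1)
        img[row, col] = True
    return img

def directed_hausdorff(a, b):
    """Directed Hausdorff distance between the 'on' pixels of two binary images."""
    pa = np.argwhere(a).astype(float)
    pb = np.argwhere(b).astype(float)
    # For every pixel of a, distance to the nearest pixel of b; take the worst case.
    dists = np.linalg.norm(pa[:, None, :] - pb[None, :, :], axis=-1)
    return dists.min(axis=1).max()

def match(trajectory, templates):
    """Return the name of the template whose rasterized shape is closest to the input."""
    img = rasterize(trajectory)
    scores = {name: max(directed_hausdorff(img, t), directed_hausdorff(t, img))
              for name, t in templates.items()}
    return min(scores, key=scores.get)

# Hypothetical usage: two single-stroke templates and a noisy horizontal swipe.
templates = {
    "swipe_right": rasterize([(i / 19.0, 0.5) for i in range(20)]),
    "swipe_down":  rasterize([(0.5, i / 19.0) for i in range(20)]),
}
noisy_swipe = [(i / 19.0, 0.5 + 0.02 * np.sin(i)) for i in range(20)]
print(match(noisy_swipe, templates))  # -> swipe_right
```

Rasterizing trajectories before matching makes the comparison insensitive to sampling rate and small timing differences between users, which is in the spirit of the shape-feature matching the abstract describes; the paper's actual feature set and matching procedure may differ.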





Acknowledgements

This research was partially supported by the National Natural Science Foundation of China (NSFC) under projects No. 60705013, No. 60872150, No. 60803101, and No. 60773023; by the China Postdoctoral Science Foundation special funding under project No. 200902665; by the China Postdoctoral Science Foundation under project No. 20070410977; and by the Natural Science Foundation of Hunan Province, China, under project No. 08JJ4018.

We would like to thank the participants of our user study for their participation and comments. We would also like to thank the anonymous reviewers for their insightful comments, which helped us improve this paper.

Author information


Corresponding author

Correspondence to De-xin Wang.


About this article

Cite this article

Wang, D.-x., Xiong, Z.-h. & Zhang, M.-j. An application oriented and shape feature based multi-touch gesture description and recognition method. Multimed Tools Appl 58, 497–519 (2012). https://doi.org/10.1007/s11042-011-0730-4

