
Mouse operation on monitor by interactive analysis of intuitive hand motions

Multimedia Tools and Applications

Abstract

The natural user interface/experience (NUI/NUX) enables natural motion-based interaction without a device or tool such as a mouse, keyboard, pen, or marker. Typical motion recognition methods to date have used markers to capture coordinate input values as relative data and store them in a database. However, recognizing motion accurately requires more markers, and attaching the markers and processing the data takes considerable time. In addition, because existing NUI/NUX frameworks are built on basic intuition alone, usability problems arise that force users to learn many framework-specific operations. To address this problem, in this paper we design a multi-modal NUI/NUX framework controlled simultaneously by voice, gesture motion, and facial expression, and propose a new algorithm for mouse operations that analyzes intuitive hand gestures and maps them onto the monitor. We also implement a “dynamic mouse area,” which enables people of all ages to handle the “hand mouse” operation easily and intuitively.
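The core of the hand-mouse idea can be pictured as normalizing the tracked hand position within the dynamic mouse area and scaling it to the monitor resolution. The following is a minimal sketch of that mapping, assuming the dynamic mouse area is a rectangle tracked around the user; the names, geometry, and example values are hypothetical illustrations, not the authors' implementation.

```python
# Hypothetical sketch: mapping a tracked hand position inside a
# "dynamic mouse area" (a rectangle that follows the user) onto
# monitor coordinates. Names and geometry are assumptions, not the
# authors' implementation.

from dataclasses import dataclass


@dataclass
class Rect:
    x: float       # top-left corner, sensor coordinates
    y: float
    width: float
    height: float


def hand_to_monitor(hand_x: float, hand_y: float,
                    mouse_area: Rect,
                    screen_w: int, screen_h: int) -> tuple:
    """Normalize the hand position within the dynamic mouse area and
    scale it to the monitor resolution, clamping to the screen edges."""
    u = (hand_x - mouse_area.x) / mouse_area.width
    v = (hand_y - mouse_area.y) / mouse_area.height
    u = min(max(u, 0.0), 1.0)
    v = min(max(v, 0.0), 1.0)
    return int(u * (screen_w - 1)), int(v * (screen_h - 1))


# Example: a 0.4 m x 0.3 m area in front of the user,
# mapped onto a 1920x1080 monitor.
area = Rect(x=-0.2, y=-0.15, width=0.4, height=0.3)
print(hand_to_monitor(0.05, 0.0, area, 1920, 1080))
```

Because the mouse area moves with the user rather than being fixed in sensor space, small, comfortable hand motions can cover the whole screen regardless of where the user stands, which is the usability benefit the abstract attributes to the dynamic mouse area.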




Acknowledgment

This research was supported by the MSIP (Ministry of Science, ICT and Future Planning), Korea, under the ITRC (Information Technology Research Center) support program (NIPA-2013-H0301-13-4007) supervised by the NIPA (National IT Industry Promotion Agency).

Author information


Corresponding author

Correspondence to Dongkyoo Shin.


About this article


Cite this article

Lee, G., Shin, D. & Shin, D. Mouse operation on monitor by interactive analysis of intuitive hand motions. Multimed Tools Appl 75, 15261–15274 (2016). https://doi.org/10.1007/s11042-014-2357-8


  • DOI: https://doi.org/10.1007/s11042-014-2357-8
