Object manipulation and deformation using hand gestures

  • Original Research
  • Published in: Journal of Ambient Intelligence and Humanized Computing

Abstract

Compared to the mouse, a precise but two-dimensional interface device, hand gestures combined with intelligent computing methods give users more degrees of freedom for interacting with computers. The Leap Motion Controller has gained popularity for its ability to detect and track hand joints in three dimensions; in some situations, however, its measurements are not sufficiently accurate. We show that occlusion, palm angle, and the limited field of view are its main weaknesses. In this paper, a framework is proposed for manipulating and deforming three-dimensional objects with hand gestures. We select only a few gestures so that the system's commands are easy to memorize, and we define them loosely so that users can perform them correctly without fatigue. We propose that computing a reliable interaction space from the Leap Motion Controller data can significantly reduce these tracking problems. Objects are deformed with the Free Form Deformation technique, which supports localized deformation. Together, the selected gestures and the restricted interaction space let the framework balance accuracy, user factors, the tasks required for deformation, and the limitations of the hand-tracking device. Compared to related studies, the proposed method offers more creative ways to deform objects and more natural movements for interacting with the system. In the conducted user study, a significant difference is observed between hand-gesture and mouse interaction in terms of speed and number of attempts.
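
The abstract names two concrete techniques: restricting input to a reliable sub-volume of the Leap Motion Controller's field of view, and Sederberg and Parry's Free Form Deformation (FFD) for shaping the mesh. The sketch below illustrates both under stated assumptions; it is not the authors' implementation, and the clamp bounds, lattice resolution, and function names are all illustrative.

```python
# Minimal sketch, not the paper's implementation: (1) clamp tracked positions
# to an assumed "reliable" sub-volume of the Leap Motion field of view, and
# (2) Sederberg & Parry-style free-form deformation (FFD) via a Bernstein
# control lattice. Bounds, lattice size, and names are illustrative.
import numpy as np
from math import comb


def clamp_to_reliable_space(points,
                            lo=np.array([-120.0, 150.0, -120.0]),
                            hi=np.array([120.0, 350.0, 120.0])):
    """Clamp (N, 3) fingertip positions (mm, y-up, Leap convention) to a
    hypothetical sub-volume where tracking is assumed to stay accurate."""
    return np.clip(points, lo, hi)


def bernstein(n, i, t):
    """Bernstein basis polynomial B_{i,n}(t), evaluated elementwise."""
    return comb(n, i) * t**i * (1.0 - t)**(n - i)


def ffd(vertices, lattice, box_min, box_max):
    """Deform (V, 3) vertices with a control lattice of shape
    (l+1, m+1, n+1, 3) spanning the box [box_min, box_max]."""
    l, m, n = (s - 1 for s in lattice.shape[:3])
    # Express each vertex in local (s, t, u) coordinates of the lattice box.
    stu = (vertices - box_min) / (box_max - box_min)
    out = np.zeros_like(vertices)
    for i in range(l + 1):
        bi = bernstein(l, i, stu[:, 0])
        for j in range(m + 1):
            bj = bernstein(m, j, stu[:, 1])
            for k in range(n + 1):
                bk = bernstein(n, k, stu[:, 2])
                out += (bi * bj * bk)[:, None] * lattice[i, j, k]
    return out


# Usage: a 3x3x3 lattice over the unit cube; dragging one control point
# (as a pinch-and-pull gesture might) displaces nearby vertices the most.
axes = [np.linspace(0.0, 1.0, 3)] * 3
lattice = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1)
lattice[2, 2, 2] += np.array([0.3, 0.3, 0.3])  # move one corner control point
verts = np.random.default_rng(0).random((100, 3))
deformed = ffd(verts, lattice, np.zeros(3), np.ones(3))
```

With an undisplaced lattice the Bernstein weights form a partition of unity and `ffd` reproduces the input vertices, so only moved control points deform the mesh; this is the local control the abstract attributes to FFD.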


Availability of data and material

Data will be made available upon reasonable request.


Funding

The authors received no funding for this work.

Author information

Corresponding author

Correspondence to Hossein Ebrahimnezhad.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Mahdikhanlou, K., Ebrahimnezhad, H. Object manipulation and deformation using hand gestures. J Ambient Intell Human Comput 14, 8115–8133 (2023). https://doi.org/10.1007/s12652-021-03582-2

