A novel muscle-computer interface for hand gesture recognition using depth vision

  • Original Research
  • Published in: Journal of Ambient Intelligence and Humanized Computing

Abstract

Muscle-computer interfaces (muCIs), among the most widespread human-computer interfaces, have been widely adopted for identifying hand gestures from the electrical activity of muscles. Although multi-modal methods and machine learning algorithms have advanced muCIs considerably over the last decades, collecting and labeling large data sets creates a heavy workload and makes implementation time-consuming. In this paper, a novel muCI was developed that combines the advantages of EMG signals and depth vision, using depth vision to automatically label clusters of the collected EMG data. A three-layer hierarchical k-medoids approach was designed to extract and label the clustered features of ten hand gestures, and a multi-class linear discriminant analysis algorithm was applied to build the hand gesture classifier. The results showed that the proposed algorithm achieved high accuracy and that the muCI performed well, automatically labeling the hand gestures in all experiments. The proposed muCI can be used for hand gesture recognition without labeling data in advance and has potential applications in robot manipulation and virtual reality.
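
To make the pipeline described in the abstract concrete, the sketch below shows one plausible reading of it: EMG feature windows are clustered with k-medoids, each cluster is labeled by the gesture most often reported by a synchronized depth camera, and a multi-class LDA classifier is trained on the resulting labels. This is a minimal illustration under stated assumptions, not the authors' implementation: the flat k-medoids stands in for their three-layer hierarchy, and the feature files, the vision_labels stream, and the majority-vote labeling rule are hypothetical choices made for demonstration.

```python
# Minimal sketch (not the paper's implementation): k-medoids clustering of EMG
# feature windows, cluster labels taken from synchronized depth-vision gesture
# estimates, then a multi-class LDA gesture classifier.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def kmedoids(X, k, n_iter=100, seed=0):
    """Plain k-medoids on Euclidean distances (flat stand-in for the 3-layer hierarchy)."""
    rng = np.random.default_rng(seed)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    medoids = rng.choice(len(X), size=k, replace=False)
    for _ in range(n_iter):
        assign = np.argmin(dist[:, medoids], axis=1)               # nearest medoid per window
        new_medoids = medoids.copy()
        for c in range(k):
            members = np.where(assign == c)[0]
            if len(members):
                # new medoid = member minimizing total distance to its own cluster
                new_medoids[c] = members[np.argmin(dist[np.ix_(members, members)].sum(axis=1))]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return np.argmin(dist[:, medoids], axis=1)

# X: windowed EMG features (e.g. RMS per channel); vision_labels: gesture IDs
# estimated from the synchronized depth camera. Both files are hypothetical.
X = np.load("emg_features.npy")               # shape (n_windows, n_features)
vision_labels = np.load("vision_labels.npy")  # shape (n_windows,), integer gesture IDs

clusters = kmedoids(X, k=10)                  # ten hand gestures -> ten clusters

# Label each cluster by the majority depth-vision gesture among its windows,
# so the EMG stream never needs manual annotation.
cluster_to_gesture = {c: np.bincount(vision_labels[clusters == c]).argmax()
                      for c in np.unique(clusters)}
y = np.array([cluster_to_gesture[c] for c in clusters])

clf = LinearDiscriminantAnalysis().fit(X, y)  # multi-class LDA classifier
print("training accuracy:", clf.score(X, y))
```

The majority vote over the synchronized vision stream is the step that replaces manual labeling; at test time only the EMG features and the trained LDA model are needed.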

Notes

  1. Robot Operating System, http://www.ros.org/.


Funding

This work was supported by the European Union's Horizon 2020 research and innovation programme under the SMARTsurg project, Grant Agreement No. 732515.

Author information

Corresponding author

Correspondence to Hang Su.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Zhou, X., Qi, W., Ovur, S.E. et al. A novel muscle-computer interface for hand gesture recognition using depth vision. J Ambient Intell Human Comput 11, 5569–5580 (2020). https://doi.org/10.1007/s12652-020-01913-3

