
Fusion and Comparison of IMU and EMG Signals for Wearable Gesture Recognition

  • Conference paper
In: Biomedical Engineering Systems and Technologies (BIOSTEC 2015)

Abstract

We evaluate the performance of a wearable gesture recognition system for arm, hand, and finger motions, using the signals of an Inertial Measurement Unit (IMU) worn at the wrist and the Electromyogram (EMG) of muscles in the forearm. A set of 12 gestures was defined, similar to manipulatory movements and to gestures known from the interaction with mobile devices. We recorded performances of our gesture set by five subjects in multiple sessions. The resulting data corpus is made publicly available to build a common ground for future evaluations and benchmarks. Hidden Markov Models (HMMs) are used as classifiers to discriminate between the defined gesture classes. We achieve a recognition rate of 97.8% in session-independent recognition and of 74.3% in person-independent recognition. We give a detailed analysis of error characteristics and of the influence of each modality on the results to underline the benefits of using both modalities together.
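The pipeline the abstract describes, per-frame IMU and EMG features fused into one observation vector, one HMM trained per gesture class, and classification by maximum likelihood, can be sketched as follows. This is a minimal illustration under our own simplifying assumptions, not the authors' implementation: the flat-start parameter estimation (uniform segmentation instead of Baum-Welch re-estimation), the left-to-right topology with a fixed self-loop probability, and the synthetic "swipe" signals are all hypothetical.

```python
import numpy as np

def log_gauss(x, mean, var):
    """Log-density of frames x under a diagonal-covariance Gaussian."""
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mean) ** 2 / var, axis=-1)

def flat_start(seqs, n_states):
    """Per-state Gaussians from a flat start: each training sequence is cut
    into n_states equal segments and segment s feeds the Gaussian of state s
    (real systems would re-estimate these with Baum-Welch)."""
    buckets = [[] for _ in range(n_states)]
    for seq in seqs:
        for s, chunk in enumerate(np.array_split(seq, n_states)):
            buckets[s].append(chunk)
    means = np.array([np.concatenate(b).mean(axis=0) for b in buckets])
    vars_ = np.array([np.concatenate(b).var(axis=0) + 1e-3 for b in buckets])
    return means, vars_

def forward_loglik(seq, means, vars_, p_stay=0.6):
    """Log-likelihood of seq under a left-to-right HMM (forward algorithm
    in log space; last state is absorbing)."""
    n_states = len(means)
    emit = np.array([log_gauss(seq, m, v) for m, v in zip(means, vars_)])  # (S, T)
    log_a = np.full((n_states, n_states), -np.inf)
    for s in range(n_states):
        if s + 1 < n_states:
            log_a[s, s] = np.log(p_stay)
            log_a[s, s + 1] = np.log(1 - p_stay)
        else:
            log_a[s, s] = 0.0
    alpha = np.full(n_states, -np.inf)
    alpha[0] = emit[0, 0]  # gesture starts in the first state
    for t in range(1, seq.shape[0]):
        alpha = emit[:, t] + np.array(
            [np.logaddexp.reduce(alpha + log_a[:, j]) for j in range(n_states)])
    return np.logaddexp.reduce(alpha)

def classify(seq, models):
    """Pick the gesture class whose HMM assigns the highest likelihood."""
    return max(models, key=lambda g: forward_loglik(seq, *models[g]))

# Synthetic stand-ins for fused observations: two "IMU" channels and one
# "EMG" channel are concatenated per frame (feature-level fusion).
rng = np.random.default_rng(0)

def make_seq(up):
    t = np.linspace(0, 1, 30)
    imu = np.stack([t if up else 1 - t, np.sin(2 * np.pi * t)], axis=1)
    emg = (t if up else 1 - t)[:, None] ** 2
    return np.hstack([imu, emg]) + 0.05 * rng.standard_normal((30, 3))

train = {"swipe_up": [make_seq(True) for _ in range(5)],
         "swipe_down": [make_seq(False) for _ in range(5)]}
models = {g: flat_start(seqs, n_states=3) for g, seqs in train.items()}
print(classify(make_seq(True), models))  # expected: swipe_up
```

Concatenating the IMU and EMG feature vectors before modelling is the simplest (feature-level) fusion scheme; the per-class likelihoods could equally be combined at the decision level by scoring separate IMU and EMG models.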


Notes

  1. Thalmic Labs Inc., www.thalmic.com.

  2. PLUX wireless biosignals S.A., www.plux.info.


Acknowledgements

This research was partly funded by Google through a Google Faculty Research Award.

Author information

Correspondence to Marcus Georgi.


Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Georgi, M., Amma, C., Schultz, T. (2015). Fusion and Comparison of IMU and EMG Signals for Wearable Gesture Recognition. In: Fred, A., Gamboa, H., Elias, D. (eds) Biomedical Engineering Systems and Technologies. BIOSTEC 2015. Communications in Computer and Information Science, vol 574. Springer, Cham. https://doi.org/10.1007/978-3-319-27707-3_19

  • DOI: https://doi.org/10.1007/978-3-319-27707-3_19

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-27706-6

  • Online ISBN: 978-3-319-27707-3
