Deep Neural Network for Electromyography Signal Classification via Wearable Sensors

Ying Chang, Lan Wang, Lingjie Lin, Ming Liu
Copyright: © 2022 | Volume: 13 | Issue: 3 | Pages: 11
ISSN: 1947-3532 | EISSN: 1947-3540 | EISBN13: 9781683181835 | DOI: 10.4018/IJDST.307988
Cite Article

MLA

Chang, Ying, et al. "Deep Neural Network for Electromyography Signal Classification via Wearable Sensors." IJDST, vol. 13, no. 3, 2022, pp. 1-11. http://doi.org/10.4018/IJDST.307988

APA

Chang, Y., Wang, L., Lin, L., & Liu, M. (2022). Deep Neural Network for Electromyography Signal Classification via Wearable Sensors. International Journal of Distributed Systems and Technologies (IJDST), 13(3), 1-11. http://doi.org/10.4018/IJDST.307988

Chicago

Chang, Ying, et al. "Deep Neural Network for Electromyography Signal Classification via Wearable Sensors." International Journal of Distributed Systems and Technologies (IJDST) 13, no. 3 (2022): 1-11. http://doi.org/10.4018/IJDST.307988


Abstract

Human-computer interaction has been widely used in many fields, such as intelligent prosthesis control, sports medicine, rehabilitation medicine, and clinical medicine, and it has gradually become a research focus of social scientists. In the field of intelligent prostheses, the sEMG signal has become the most widely used control signal source because it is easy to acquire. An offline sEMG-controlled intelligent prosthesis needs to recognize gestures in order to execute the associated actions. To solve this issue, this paper adopts a CNN plus BiLSTM network to automatically extract sEMG features and recognize gestures, overcoming the drawbacks of manual feature extraction methods. The experimental results show that the proposed gesture recognition framework can extract overall gesture features, which improves the recognition rate.
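
To make the described architecture concrete, the following is a minimal PyTorch sketch of a CNN plus BiLSTM classifier for windowed sEMG signals: 1-D convolutions extract local features along the time axis, and a bidirectional LSTM models the temporal context before classification. The electrode count, window length, layer sizes, and number of gesture classes are illustrative assumptions, not the configuration reported in the paper.

# Hypothetical sketch of a CNN + BiLSTM classifier for windowed sEMG signals.
# All hyperparameters (electrode count, window length, number of gesture
# classes) are illustrative assumptions, not the paper's reported settings.
import torch
import torch.nn as nn

class CNNBiLSTM(nn.Module):
    def __init__(self, emg_channels=8, num_classes=10, hidden=64):
        super().__init__()
        # 1-D convolutions extract local features along the time axis
        self.cnn = nn.Sequential(
            nn.Conv1d(emg_channels, 32, kernel_size=5, padding=2),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # BiLSTM models temporal dependencies in both directions
        self.bilstm = nn.LSTM(input_size=64, hidden_size=hidden,
                              batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, num_classes)

    def forward(self, x):
        # x: (batch, emg_channels, time_steps)
        feats = self.cnn(x)                # (batch, 64, time_steps / 4)
        feats = feats.transpose(1, 2)      # (batch, seq_len, 64) for the LSTM
        out, _ = self.bilstm(feats)        # (batch, seq_len, 2 * hidden)
        return self.fc(out[:, -1, :])      # classify from the last time step

# Example: a batch of 16 windows, 8 electrodes, 200 samples per window
model = CNNBiLSTM()
logits = model(torch.randn(16, 8, 200))   # (16, 10) class scores

Because the network learns its features directly from the raw windowed signal, it avoids hand-crafting time- or frequency-domain descriptors, which is the drawback of manual feature extraction that the abstract refers to.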
