ABSTRACT
This paper presents ongoing research on a protocol framework for human motion recognition that transforms complex, continuous 3D motion data into a more intuitive 2D trajectory representation based on quaternion visualization. Quaternions are compact and free from gimbal lock, which makes them well suited for representing orientations and rotations of objects in 3D space. In this study, the focus is only on arm orientation, not position. In our pilot experimental evaluation, we examine our approach by visually recognizing several biceps curls from quaternion data collected with wireless inertial sensors attached to the human arm. The results of the analysis indicate that the proposed framework makes it possible to represent 3D motion data as a 2D trajectory for continuous motion patterns.
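To make the idea concrete, the sketch below shows one simple way (not necessarily the paper's own mapping) to reduce a unit-quaternion orientation sample to a 2D point: rotate a reference "forearm" axis by the quaternion, then express the resulting direction as azimuth and elevation angles. The function names and the choice of reference axis are illustrative assumptions; only the quaternion rotation formula itself is standard.

```python
import math

def quat_rotate(q, v):
    # Rotate vector v by unit quaternion q = (w, x, y, z) using
    # v' = v + 2*u x (u x v + w*v), where u = (x, y, z).
    w, x, y, z = q
    u = (x, y, z)
    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0])
    t = cross(u, v)
    t = (t[0] + w*v[0], t[1] + w*v[1], t[2] + w*v[2])
    c = cross(u, t)
    return (v[0] + 2*c[0], v[1] + 2*c[1], v[2] + 2*c[2])

def orientation_to_2d(q, ref=(1.0, 0.0, 0.0)):
    # Rotate a reference "forearm" axis by q, then reduce the
    # resulting 3D direction to (azimuth, elevation) in radians,
    # i.e. one point of a 2D trajectory.
    dx, dy, dz = quat_rotate(q, ref)
    return (math.atan2(dy, dx), math.asin(max(-1.0, min(1.0, dz))))

# Example: a 90-degree rotation about the z-axis maps +x to +y,
# so azimuth should be pi/2 and elevation 0.
q90z = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
az, el = orientation_to_2d(q90z)
```

Feeding a stream of quaternion samples from an arm-mounted IMU through `orientation_to_2d` yields the kind of continuous 2D trajectory the abstract describes, at the cost of the usual angle-wrapping caveats near the poles.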