Abstract
This paper presents a novel orientation estimation approach named inertial guided visual sample consensus (IGVSAC), designed to capture the orientation of human body joints in free-living environments. Traditional vision-based orientation estimation methods remove outliers among putative image-pair correspondences with hypothesize-and-verify schemes such as the computationally costly RANSAC. In contrast, our approach exploits prior motion information (i.e., rotation and translation) deduced from the fast-responding inertial measurement unit (IMU) as the initial body pose, helping the camera remove hidden outliers. Because it rejects apparent mismatches, the IGVSAC algorithm maintains estimation accuracy even in the presence of a large proportion of outliers. The orientation estimated by the visual sensor, in turn, corrects long-term IMU drift. We conducted extensive experiments to verify the effectiveness and robustness of the IGVSAC algorithm. Comparisons with the highly accurate VICON and OptiTrack motion tracking systems show that our orientation estimation system is well suited to capturing the orientation of human body joints.
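The core idea — scoring putative correspondences against an IMU-derived motion hypothesis rather than against randomly sampled minimal sets as in RANSAC — can be sketched as follows. This is a minimal illustration under standard epipolar geometry assumptions, not the authors' implementation; the function name and interface are hypothetical.

```python
import numpy as np

def igvsac_inliers(pts1, pts2, R_imu, t_imu, threshold=1e-3):
    """Classify putative matches as inliers against an IMU motion prior.

    pts1, pts2 : (N, 3) arrays of matched points in normalized homogeneous
                 image coordinates for the first and second view.
    R_imu, t_imu : rotation (3x3) and translation (3,) deduced from the IMU,
                   used directly as the motion hypothesis.
    Returns a boolean mask: True where the epipolar residual is small.
    """
    # Essential matrix from the IMU prior: E = [t]_x R
    tx = np.array([[0.0,       -t_imu[2],  t_imu[1]],
                   [t_imu[2],   0.0,      -t_imu[0]],
                   [-t_imu[1],  t_imu[0],  0.0]])
    E = tx @ R_imu
    # Algebraic epipolar residual |x2^T E x1| for each putative match;
    # true correspondences under the hypothesized motion give ~0.
    residuals = np.abs(np.einsum('ij,jk,ik->i', pts2, E, pts1))
    return residuals < threshold
```

Because the hypothesis comes from the IMU instead of repeated random sampling, one pass over the matches suffices to flag apparent mismatches; the surviving inliers can then refine the visual motion estimate, which in turn corrects IMU drift.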
Acknowledgements
The authors would like to thank Guoli Song for his assistance in conducting the MC-IMU motion tracking experiments, Dr. Bo Yang for his thoughtful suggestions, and the peer reviewers for their valuable advice on this paper. This work was supported by the National Science Foundation of China under contracts 61233007, 61673371, and 71661147005, and by the Youth Innovation Promotion Association, CAS (2015157).
Cite this article
Zhang, Y., Liang, W., Li, Y. et al. Robust orientation estimate via inertial guided visual sample consensus. Pers Ubiquit Comput 22, 259–274 (2018). https://doi.org/10.1007/s00779-017-1040-2