Robust orientation estimate via inertial guided visual sample consensus

  • Original Article
  • Published in: Personal and Ubiquitous Computing

Abstract

This paper presents a novel orientation estimation approach named inertial guided visual sample consensus (IGVSAC), designed for capturing the orientation of human body joints in free-living environments. Unlike traditional vision-based orientation estimation methods, which remove outliers among putative image-pair correspondences with hypothesize-and-verify models such as the computationally costly RANSAC, our approach exploits prior motion information (i.e., rotation and translation) deduced from the fast-responding inertial measurement unit (IMU) as the initial body pose to help the camera remove hidden outliers. In addition, the IGVSAC algorithm maintains estimation accuracy even in the presence of a large number of outliers, thanks to its ability to reject apparent mismatches. The orientation estimated by the visual sensor, in turn, corrects long-term IMU drift. We conducted extensive experiments to verify the effectiveness and robustness of IGVSAC. Comparisons with the highly accurate VICON and OptiTrack motion tracking systems show that our system is well suited to capturing the orientation of human body joints.
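
The abstract gives no implementation details, but its core idea, replacing RANSAC's random hypotheses with a single IMU-derived pose hypothesis, can be sketched from standard epipolar geometry. The following is a minimal illustration, not the authors' implementation: the function name igvsac_inliers, the Sampson-error test, and the threshold value are all assumptions.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix such that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

def igvsac_inliers(pts1, pts2, R_imu, t_imu, threshold=1e-3):
    """Hypothetical sketch of the IGVSAC idea: use the rotation R_imu and
    translation direction t_imu predicted by the IMU as the single initial
    pose hypothesis, and keep only correspondences consistent with the
    epipolar geometry it induces.

    pts1, pts2 : (N, 2) arrays of normalized image coordinates
                 (i.e., pixel coordinates already multiplied by K^-1).
    R_imu      : (3, 3) rotation matrix from the IMU prior.
    t_imu      : (3,) translation direction from the IMU prior.
    threshold  : Sampson-error cutoff for declaring an inlier (illustrative).
    """
    # Essential matrix implied by the IMU prior: E = [t]_x R
    E = skew(t_imu / np.linalg.norm(t_imu)) @ R_imu

    # Homogeneous coordinates of the putative matches
    x1 = np.hstack([pts1, np.ones((len(pts1), 1))])
    x2 = np.hstack([pts2, np.ones((len(pts2), 1))])

    # First-order geometric (Sampson) error of each correspondence
    Ex1 = x1 @ E.T            # rows are E @ x1_i
    Etx2 = x2 @ E             # rows are E^T @ x2_i
    num = np.sum(x2 * Ex1, axis=1) ** 2
    den = Ex1[:, 0]**2 + Ex1[:, 1]**2 + Etx2[:, 0]**2 + Etx2[:, 1]**2
    sampson = num / den

    return sampson < threshold   # boolean inlier mask
```

Under these assumptions, the surviving inliers would then be used to re-estimate the relative pose (e.g., via the eight-point algorithm), and the resulting visual orientation fed back to bound the IMU's long-term drift, mirroring the complementary roles the abstract describes.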





Acknowledgements

The authors would like to thank Guoli Song for his assistance in conducting the MC-IMU motion tracking experiments, Dr. Bo Yang for his thoughtful suggestions, and the peer reviewers for their valuable advice on this paper. This work was supported by the National Science Foundation of China under contracts 61233007, 61673371, and 71661147005, and by the Youth Innovation Promotion Association, CAS (2015157).

Author information

Correspondence to Wei Liang or Jindong Tan.


About this article


Cite this article

Zhang, Y., Liang, W., Li, Y. et al. Robust orientation estimate via inertial guided visual sample consensus. Pers Ubiquit Comput 22, 259–274 (2018). https://doi.org/10.1007/s00779-017-1040-2


Keywords

Navigation