
Ego-motion estimation concepts, algorithms and challenges: an overview


Abstract

Ego-motion estimation is of great significance for computer vision, robotics, augmented reality and visual simultaneous localization and mapping. This paper surveys the basic concepts, equipment, algorithms, challenges and real-world applications of ego-motion estimation. We first give an overview of motion estimation in general, with a special focus on ego-motion estimation and its underlying concepts: estimating ego-motion requires an understanding of independently moving objects, the focus of expansion, the motion field, and optical flow. The principal algorithms used for ego-motion estimation are then critically discussed, and various camera setups are examined in this context along with their respective strengths and weaknesses. We also briefly describe some real-world applications of ego-motion estimation. The paper closes with a discussion of open problems, suggestions for future directions, and a summary of the main findings.
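Although the full pipeline is developed later in the paper, a minimal sketch of the kind of optical-flow-based ego-motion estimation discussed here is given below. It is not the authors' implementation: it assumes OpenCV and NumPy are available, that the camera intrinsic matrix K is known (the values below are placeholders), and that two consecutive grayscale frames exist at the hypothetical paths shown. Corners are tracked with Lucas-Kanade sparse optical flow, RANSAC on the essential matrix discards tracks on independently moving objects, and the camera rotation and heading (translation up to scale) are recovered.

```python
# Minimal monocular ego-motion sketch (assumptions: OpenCV/NumPy installed,
# intrinsics K known, frame files exist; all names are placeholders).
import cv2
import numpy as np

# Assumed camera intrinsics (focal lengths and principal point in pixels).
K = np.array([[718.8, 0.0, 607.2],
              [0.0, 718.8, 185.2],
              [0.0, 0.0, 1.0]])

prev_img = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)  # hypothetical paths
curr_img = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Detect corners in the first frame and track them into the second frame
# with pyramidal Lucas-Kanade sparse optical flow.
p0 = cv2.goodFeaturesToTrack(prev_img, maxCorners=2000, qualityLevel=0.01, minDistance=7)
p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_img, curr_img, p0, None)
good0 = p0[status.ravel() == 1]
good1 = p1[status.ravel() == 1]

# Fit the essential matrix with RANSAC; outliers typically correspond to
# independently moving objects, which must not influence the ego-motion.
E, inliers = cv2.findEssentialMat(good1, good0, K, method=cv2.RANSAC,
                                  prob=0.999, threshold=1.0)

# Decompose E into rotation R and unit translation t. With a single camera
# the translation magnitude (scale) is unobservable.
_, R, t, _ = cv2.recoverPose(E, good1, good0, K, mask=inliers)

print("Rotation:\n", R)
print("Heading direction:\n", t.ravel())
```

The direction of t corresponds to the focus of expansion discussed in the concepts section; stereo or inertial measurements would be needed to recover the metric scale of the translation.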



Acknowledgments

We would like to thank Dr. Saleem Gul, Institute of Management Sciences, Peshawar, Pakistan, for his help in structuring the research article. We also appreciate the efforts of Dr. Muhammad Haseeb, Department of Computer Science, University of Peshawar, Pakistan.

Author information

Corresponding author

Correspondence to Naila Habib Khan.


About this article


Cite this article

Khan, N.H., Adnan, A. Ego-motion estimation concepts, algorithms and challenges: an overview. Multimed Tools Appl 76, 16581–16603 (2017). https://doi.org/10.1007/s11042-016-3939-4


Keywords

Navigation