Abstract
For collision-free autonomous navigation, pose estimation plays a pivotal role because it enables a robot to localize itself in its operating environment. A broad range of highly accurate and precise sensors is available for position and orientation estimation, but such sensors are expensive. This work introduces a simple yet robust method for estimating the robot pose using ArUco markers and a particle filter. The proposed approach obtains position information from the tvec (translation vector) of the detected ArUco marker's coordinate frame. Because marker detection flickers due to various environmental and system factors, measurement errors make it unreliable to establish a feasible robot heading vector directly from the marker. Therefore, instead of taking the orientation directly from the rvec (rotation vector), forward filtering-backward smoothing recursions generate the heading vector (and, consequently, the steering commands) from the camera observations. The Q matrix values are chosen according to the anticipated process noise in the target's position, speed, acceleration, heading angle, and turning rate; the R matrix values are selected based on the deviation of the target states over time. Increasing the number of smoothing levels substantially reduces estimation errors, and the smoothing filter proves crucial for correcting unexpected sensor errors caused by environmental lighting conditions, network data-transfer lag, and an unstable frame rate (fps). The study integrates the pure pursuit (PP) algorithm into the navigation framework for path following, and discretized PID control equations eliminate the errors between the desired and actual heading and speed of the robot. The system is simulated in a Gazebo environment and implemented on a 4-wheeled Ackermann-drive mobile robot. Performance is evaluated using average speed, position, and heading errors, and the findings demonstrate the efficacy and robustness of the proposed method.
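As a concrete illustration of the pose-acquisition step, the sketch below extracts tvec and rvec from a detected ArUco marker with OpenCV. It is a minimal sketch, not the paper's implementation: the dictionary, marker size, and camera intrinsics are placeholder assumptions, and it uses the functional cv2.aruco API of opencv-contrib-python releases up to 4.6 (newer releases move detection into a cv2.aruco.ArucoDetector object).

```python
import cv2
import numpy as np

# Placeholder calibration; real values come from camera calibration
# (e.g. cv2.calibrateCamera). All numbers here are assumptions.
camera_matrix = np.array([[600.0,   0.0, 320.0],
                          [  0.0, 600.0, 240.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(5)
MARKER_LENGTH = 0.10  # marker side length in metres (assumed)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

frame = cv2.imread("frame.png")  # stand-in for a live camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Functional API of opencv-contrib-python <= 4.6; OpenCV >= 4.7 uses
# cv2.aruco.ArucoDetector(dictionary).detectMarkers(gray) instead.
corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)

if ids is not None:
    # One rvec/tvec pair per marker, expressed in the camera frame.
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, MARKER_LENGTH, camera_matrix, dist_coeffs)
    for marker_id, tvec in zip(ids.flatten(), tvecs):
        print(f"marker {marker_id}: position {tvec.ravel()} (m)")
```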
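The heading estimate relies on forward filtering-backward smoothing. The sketch below shows that recursion for a single heading state, assuming a Gaussian random-walk motion model and scalar process/measurement noise standing in for the paper's Q and R matrices (whose full state also covers position, speed, acceleration, and turning rate); it is illustrative only, not the authors' filter.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss(x, mu, sigma):
    """Gaussian density, used for both likelihood and transition density."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

N, T = 200, 50
q, r = 0.05, 0.3                        # assumed process / measurement noise
true = np.cumsum(rng.normal(0, q, T))   # simulated true heading
z = true + rng.normal(0, r, T)          # noisy camera-derived heading observations

# Forward pass: bootstrap particle filter with resampling at every step.
X = np.empty((T, N)); W = np.empty((T, N))
x = rng.normal(z[0], r, N)
for t in range(T):
    x = x + rng.normal(0, q, N)                     # propagate motion model
    w = gauss(z[t], x, r); w /= w.sum()             # weight by likelihood
    X[t], W[t] = x, w                               # store filtering particles
    x = x[rng.choice(N, N, p=w)]                    # multinomial resampling

# Backward pass: smoothing weights via the backward recursion
# w_{t|T}^i = w_t^i * sum_j [ w_{t+1|T}^j f(x_{t+1}^j | x_t^i) / sum_l w_t^l f(x_{t+1}^j | x_t^l) ].
ws = W[-1].copy()
smoothed = [np.sum(ws * X[-1])]
for t in range(T - 2, -1, -1):
    f = gauss(X[t + 1][None, :], X[t][:, None], q)  # f[i, j] = f(x_{t+1}^j | x_t^i)
    denom = (W[t][:, None] * f).sum(axis=0)
    ws = W[t] * (ws * f / denom).sum(axis=1)
    ws /= ws.sum()
    smoothed.append(np.sum(ws * X[t]))              # smoothed heading estimate
smoothed.reverse()
```

Consistent with the abstract's observation, the backward pass pools information from later observations, so isolated flickers in the marker detections perturb the smoothed heading far less than the raw filtered estimate.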
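For the path-following and control layer, the following sketch pairs the standard pure pursuit steering law with a positional-form discretized PID controller. The wheelbase, lookahead distance, and gains are assumed values for illustration, not the tuned parameters used in the paper.

```python
import math

WHEELBASE = 0.3   # metres (assumed)
LOOKAHEAD = 0.5   # lookahead distance in metres (assumed)

def pure_pursuit_steer(x, y, yaw, goal):
    """Steering angle delta = atan(2*L*sin(alpha) / ld) toward the lookahead point."""
    alpha = math.atan2(goal[1] - y, goal[0] - x) - yaw  # angle to lookahead point
    return math.atan2(2.0 * WHEELBASE * math.sin(alpha), LOOKAHEAD)

class DiscretePID:
    """Positional-form discrete PID: u_k = Kp*e_k + Ki*sum(e_j)*dt + Kd*(e_k - e_{k-1})/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error * self.dt                    # accumulate integral term
        derivative = (error - self.prev_error) / self.dt    # backward-difference derivative
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Usage: drive the speed error (desired minus actual) to zero each control tick.
speed_pid = DiscretePID(kp=1.0, ki=0.1, kd=0.05, dt=0.05)   # assumed gains
throttle = speed_pid.step(error=0.4)
```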
Data Availability
Not applicable.
Funding
This study did not receive any funding.
Contributions
All the authors contributed equally to this work.
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Research involving human and/or animals
Not applicable.
Informed consent
Not applicable.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Alam, M.S., Gullu, A.I. & Gunes, A. Fiducial Markers and Particle Filter Based Localization and Navigation Framework for an Autonomous Mobile Robot. SN COMPUT. SCI. 5, 748 (2024). https://doi.org/10.1007/s42979-024-03090-y