
Intelligent Automatic Object Tracking Method by Integration of Laser Scanner System and INS

Published in Programming and Computer Software

Abstract

Autonomous vehicle navigation has improved considerably in recent years; however, this technology has not yet been effectively integrated into mobile systems that operate inside buildings or houses, or anywhere beyond the reach of absolute references such as GPS. For this reason, this paper proposes a methodology that integrates object detection methods such as YOLO (You Only Look Once) or R-CNN (Region-based Convolutional Neural Networks) with a laser scanning system that measures spatial coordinates and an INS that provides spatial positioning from relative references. The method combines the principle of dynamic triangulation with the IKZ (Inertial Navigation with Kalman and Zero Update) methodology, which together yield acceptable autonomous operation times between re-scans of the sector of interest.
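The laser scanning subsystem relies on dynamic triangulation to recover spatial coordinates. As a minimal sketch of the two-angle triangulation principle (a generic textbook formulation, not the authors' full scanning system; the baseline and angle names are illustrative assumptions):

```python
import math

def triangulation_distance(baseline, angle_b, angle_c):
    """Distance from the sensor baseline to a scanned point.

    Two-angle laser triangulation: the emitter and receiver sit at the
    ends of a known `baseline` (meters), and `angle_b`, `angle_c` are
    the measured interior angles (radians) at each end of that
    baseline. The perpendicular distance to the point follows from the
    law of sines: d = a * sin(B) * sin(C) / sin(B + C).
    """
    return (baseline * math.sin(angle_b) * math.sin(angle_c)
            / math.sin(angle_b + angle_c))

# With a 1 m baseline and both angles at 60 degrees the triangle is
# equilateral, so the distance is the triangle height, sqrt(3)/2 m.
d = triangulation_distance(1.0, math.radians(60.0), math.radians(60.0))
```

A real scanner would sweep these angles continuously and convert each (angle, distance) pair into 3D coordinates in the robot frame.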
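The IKZ component couples a strapdown INS with a Kalman filter and zero-velocity updates to limit drift between re-scans. As a rough 1-D illustration of the zero-update idea alone (it omits the Kalman filter and attitude handling; the function and variable names are hypothetical):

```python
import numpy as np

def integrate_with_zupt(acc, dt, stationary):
    """1-D dead reckoning with zero-velocity updates (ZUPT).

    acc        : accelerometer samples (m/s^2, bias-corrected)
    dt         : sample period (s)
    stationary : boolean array; True where the platform is known to be
                 at rest (e.g. detected from low accelerometer variance)

    Whenever `stationary` is True, the integrated velocity is reset to
    zero, which suppresses the quadratic position drift that raw
    double integration of a biased accelerometer produces.
    """
    v = 0.0
    x = 0.0
    positions = []
    for a, at_rest in zip(acc, stationary):
        v = 0.0 if at_rest else v + a * dt
        x += v * dt
        positions.append(x)
    return np.array(positions)
```

With a constant residual bias and the platform at rest, the zero update holds the estimated position at zero, whereas plain double integration would drift quadratically.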

Fig. 1.
Fig. 2.
Fig. 3.
Fig. 4.
Fig. 5.



ACKNOWLEDGMENTS

The work is partially supported by the Autonomous University of Baja California.

Author information

Correspondence to J. C. Rodríguez-Quiñonez.

About this article


Cite this article

Rodríguez-Quiñonez, J.C. Intelligent Automatic Object Tracking Method by Integration of Laser Scanner System and INS. Program Comput Soft 46, 619–625 (2020). https://doi.org/10.1134/S0361768820080186
