
Neural network learning from demonstration and epipolar geometry for visual control of a nonholonomic mobile robot

  • Methodologies and Application
  • Published in: Soft Computing

Abstract

Controlling a robot system from camera information is a challenging task under unpredictable conditions such as feature-point mismatches and changing scene illumination. This paper presents a solution, based on machine learning techniques, for the visual control of a nonholonomic mobile robot in demanding real-world circumstances. A novel intelligent approach that combines neural networks (NNs), a learning from demonstration (LfD) framework, and the epipolar geometry between two views is proposed and evaluated in a series of experiments. A direct mapping from the image space to actuator commands is carried out in two phases. In the offline phase, the NN–LfD approach relates the feature position in the image plane to the angular velocity used for lateral motion correction. The online phase employs a switching vision-based scheme that selects between an epipole-based linear velocity controller and the NN–LfD-based angular velocity controller, depending on the feature's distance from a predefined interest area in the image. In total, 18 architectures and 6 learning algorithms are tested in order to find the optimal solution for robot control. The best training outcome for each learning algorithm is then employed in real time to determine the optimal NN configuration for correcting the robot's orientation. Experiments conducted with a nonholonomic mobile robot in a structured indoor environment confirm excellent performance with respect to system robustness and positioning accuracy at the desired location.
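The online switching scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the interest-area representation (center and radius in pixels), the gain `k_v`, and the function names are all assumptions introduced here.

```python
import numpy as np

def switching_control(feature_px, interest_center, interest_radius,
                      epipole_x, nn_predict):
    """Select between the two controllers of the online phase.

    feature_px      -- tracked feature position (u, v) in pixels
    interest_center -- center of the predefined interest area in pixels
    interest_radius -- assumed radius of the interest area in pixels
    epipole_x       -- horizontal epipole coordinate from two-view geometry
    nn_predict      -- trained NN-LfD mapping: feature position -> angular velocity
    Returns (v, w): linear and angular velocity commands.
    """
    dist = np.linalg.norm(np.asarray(feature_px, dtype=float)
                          - np.asarray(interest_center, dtype=float))
    if dist > interest_radius:
        # Feature outside the interest area: correct the lateral/orientation
        # error with the NN-LfD angular velocity controller.
        v = 0.0
        w = nn_predict(feature_px)
    else:
        # Feature inside the interest area: drive forward with an
        # epipole-based linear velocity (k_v is an assumed gain).
        k_v = 0.5
        v = k_v * abs(epipole_x)
        w = 0.0
    return v, w
```

For example, a feature far from the interest area triggers the NN branch (forward motion paused while orientation is corrected), while a feature inside it yields pure forward motion driven by the epipole magnitude.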





Acknowledgments

This work is supported by the Serbian Government—the Ministry of Education, Science and Technological Development—through the project TR35004 (2011–2014).

Author information


Corresponding author

Correspondence to Marko Mitić.

Additional information

Communicated by V. Piuri.


About this article

Cite this article

Mitić, M., Miljković, Z. Neural network learning from demonstration and epipolar geometry for visual control of a nonholonomic mobile robot. Soft Comput 18, 1011–1025 (2014). https://doi.org/10.1007/s00500-013-1121-8


Keywords

Navigation