
Part of the book series: Studies in Computational Intelligence (SCI, volume 500)


Abstract

The interaction between a visual system and its environment is an important research topic in purposive vision, which seeks to establish a link between perception and action. When a robotic system uses vision as its main source of information about the environment, it must be selective with the perceived data. To fulfill the task at hand, we must contrive a way of extracting from the images the data that help achieve the system’s goal; this selective process is what we call a visual behavior. In this paper, we present an automatic process for synthesizing visual behaviors through genetic programming, resulting in specialized conspicuous point detection algorithms used to estimate the trajectory of a camera within a simultaneous localization and map building system. We present a real working system; the experiments were carried out with a robotic manipulator in a hand-eye configuration. The main idea of our work is to evolve a conspicuous point detector based on the concept of an artificial dorsal stream. We show experimentally that it is indeed possible to find conspicuous points in an image through a visual attention process, and that such detectors can be generated purposefully by an evolutionary algorithm aimed at solving a specific task.
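
To make the evolutionary loop described above concrete, the following is a minimal Python sketch, not the authors' implementation, of the general idea: candidate image-operator trees are built from a small set of illustrative primitives, applied to each frame to produce a conspicuity map, the strongest responses are kept as conspicuous points, and each candidate detector is ranked by a trajectory-error fitness. The primitives, the tree representation, and the placeholder function trajectory_error (which in the real system would run the detector inside the monocular SLAM pipeline and compare the estimated camera path against ground truth) are assumptions made for illustration and do not come from the paper.

```python
import random
import numpy as np

# Primitive image operators used as GP building blocks (illustrative choices,
# not the function set from the paper).
def grad_x(img):
    return np.abs(np.diff(img, axis=1, prepend=img[:, :1]))

def grad_y(img):
    return np.abs(np.diff(img, axis=0, prepend=img[:1, :]))

def blur(img):
    # Cheap 3x3 box blur standing in for a Gaussian.
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def add(a, b):
    return a + b

def absdiff(a, b):
    return np.abs(a - b)

UNARY, BINARY = [grad_x, grad_y, blur], [add, absdiff]

def random_tree(depth=3):
    """Random operator tree as nested tuples; the leaf 'img' is the input image."""
    if depth == 0 or random.random() < 0.3:
        return "img"
    if random.random() < 0.5:
        return (random.choice(UNARY), random_tree(depth - 1))
    return (random.choice(BINARY), random_tree(depth - 1), random_tree(depth - 1))

def apply_tree(tree, img):
    """Evaluate an operator tree on an image, producing a conspicuity map."""
    if tree == "img":
        return img
    op, *args = tree
    return op(*(apply_tree(a, img) for a in args))

def detect_points(conspicuity, k=50):
    """Keep the k strongest responses as conspicuous points (row, col)."""
    idx = np.argsort(conspicuity, axis=None)[-k:]
    return np.column_stack(np.unravel_index(idx, conspicuity.shape))

def trajectory_error(points_per_frame):
    """Hypothetical fitness: in the real system this would feed the points to a
    monocular SLAM pipeline and return the camera-trajectory error with respect
    to ground truth. Here it is only a random placeholder."""
    return random.random()

def evolve(frames, generations=20, pop_size=30):
    """Crude evolutionary loop: rank detectors by trajectory error, keep the best
    half, and refill with fresh random trees (a stand-in for mutation/crossover)."""
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda t: trajectory_error(
            [detect_points(apply_tree(t, f)) for f in frames]))
        pop = scored[:pop_size // 2] + [random_tree() for _ in range(pop_size - pop_size // 2)]
    return scored[0]

if __name__ == "__main__":
    frames = [np.random.rand(64, 64) for _ in range(3)]  # synthetic stand-in frames
    best_detector = evolve(frames, generations=5, pop_size=10)
    print(best_detector)
```

In the paper the evolved programs follow the structure of an artificial dorsal stream rather than arbitrary operator trees; the sketch only conveys the evaluation loop in which each candidate detector is judged by the quality of the camera trajectory it produces.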




Author information

Corresponding author

Correspondence to Daniel E. Hernández.



Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Hernández, D.E., Olague, G., Clemente, E., Dozal, L. (2014). Optimizing a Conspicuous Point Detector for Camera Trajectory Estimation with Brain Programming. In: Schuetze, O., et al. EVOLVE - A Bridge between Probability, Set Oriented Numerics, and Evolutionary Computation III. Studies in Computational Intelligence, vol 500. Springer, Heidelberg. https://doi.org/10.1007/978-3-319-01460-9_6


  • DOI: https://doi.org/10.1007/978-3-319-01460-9_6

  • Publisher Name: Springer, Heidelberg

  • Print ISBN: 978-3-319-01459-3

  • Online ISBN: 978-3-319-01460-9

  • eBook Packages: Engineering, Engineering (R0)
