Learning the “next” dimension

  • Conference paper
Evolutionary Computing (AISB EC 1996)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1143)

Abstract

In this paper we develop a novel search framework, based upon the building block hypothesis, for the optimization of functions over continuous domains. We test one particular heuristic defined within this framework (Assumption 2, Section 2) on a number of test functions, and it exhibits promising performance. Since the heuristic is deterministic, it is relatively easy to design a test function on which it fails; however, the framework is general enough to define various other heuristics. Moreover, because the framework essentially represents a search over a tree structure, experience with search methods developed in the field of AI can be readily incorporated into it. An important question for future research is how to exploit problem structure in order to define appropriate heuristics within the proposed framework.

Another possible line of future research is the utilisation of the search framework as a decision-support tool that would interactively assist in the global optimization process.
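As a rough illustration of the kind of framework described in the abstract, the sketch below shows one generic way a tree-structured search over a continuous domain can be organised: each node is an axis-aligned box, and expanding a node splits it along one chosen "next" dimension. This is a minimal sketch under stated assumptions only; the paper's actual heuristic (Assumption 2, Section 2) is not reproduced on this page, so the best-first scheme, the longest-side splitting rule, and all identifiers below are illustrative placeholders rather than the authors' method.

    import heapq
    import itertools


    def box_search(f, bounds, max_evals=200):
        """Best-first tree search over axis-aligned boxes (illustrative sketch).

        Each tree node is a box; expanding a node halves it along one chosen
        dimension.  The longest-side rule used here is a placeholder for a
        problem-specific "next dimension" heuristic.
        """
        tie = itertools.count()  # tie-breaker so the heap never compares boxes

        def centre(box):
            return [(lo + hi) / 2.0 for lo, hi in box]

        root = tuple(tuple(b) for b in bounds)
        best_x = centre(root)
        best_f = f(best_x)
        evals = 1
        heap = [(best_f, next(tie), root)]

        while heap and evals < max_evals:
            _, _, box = heapq.heappop(heap)
            # Choose the "next" dimension to branch on: here, the longest side.
            d = max(range(len(box)), key=lambda i: box[i][1] - box[i][0])
            lo, hi = box[d]
            mid = (lo + hi) / 2.0
            for half in ((lo, mid), (mid, hi)):
                child = box[:d] + (half,) + box[d + 1:]
                x = centre(child)
                fx = f(x)
                evals += 1
                if fx < best_f:
                    best_f, best_x = fx, x
                heapq.heappush(heap, (fx, next(tie), child))

        return best_x, best_f


    if __name__ == "__main__":
        # Minimise the 2-D sphere function on [-5, 5]^2.
        sphere = lambda x: sum(xi * xi for xi in x)
        print(box_search(sphere, [(-5.0, 5.0), (-5.0, 5.0)]))

Under these assumptions, the rule for choosing which dimension to split plays the role of the "next dimension" heuristic that the framework leaves open to the designer.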

Author information

Authors: G. Bilchev and I. C. Parmee

Editor information

Terence C. Fogarty

Copyright information

© 1996 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Bilchev, G., Parmee, I.C. (1996). Learning the “next” dimension. In: Fogarty, T.C. (eds) Evolutionary Computing. AISB EC 1996. Lecture Notes in Computer Science, vol 1143. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0032781

  • DOI: https://doi.org/10.1007/BFb0032781

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-61749-5

  • Online ISBN: 978-3-540-70671-7
