On Natural Based Optimization

Abstract

Nature has always been a source of great inspiration for engineers and mathematicians. Evolutionary Algorithms are the latest in a line of natural-based innovations that have had a profound effect on the application of optimization in science and engineering. Although based on nature, Evolutionary Algorithms are nonetheless distinctly different from natural evolution in several areas. This paper outlines early and recent developments of Evolutionary Algorithms and covers those areas of difference. Practical issues related to the use of Evolutionary Algorithms, the key parameters that affect the quality of the search, and the impact of user choices in problem formulation are also covered.

Notes

  1. A phenotype is the expression of a complete genetic package. In the biological world, phenotypes are organisms, for example humans.

  2. Non-genetic.

  3. From a finite domain \(\mathcal{X}\) to a finite domain \(\mathcal{Y}\).

  4. The Building Block Hypothesis is classically attributed to Goldberg. However, Holland is also credited here because, in a recent paper [33], Goldberg refers to the building block hypothesis as Holland’s idea.

  5. As opposed to its ‘magnitude’ and ‘direction’.

  6. The first number is the number of parents and the second is the number of children. The ‘+’ indicates that selection takes place over the combined population of parents and children, whereas ‘,’ indicates selection among the children only (see the sketch after these notes).

  7. One of three, in this case.

  8. If two strings are identical, crossover merely exchanges identical information and produces no new genetic material.

  9. Also referred to as elitist selection.
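
To make the ‘+’ versus ‘,’ notation of note 6 concrete (and the elitism of note 9, since ‘+’ selection never discards the current best parent), the following is a minimal Python sketch of a (μ + λ) / (μ, λ) evolution strategy. It is not the implementation used in the paper; the sphere objective, the fixed mutation step size and the function names are illustrative assumptions.

import random

def sphere(x):
    # Example objective (assumed for illustration): minimise the sum of squares.
    return sum(xi * xi for xi in x)

def evolve(mu, lam, plus_selection, dim=5, sigma=0.1, generations=200, seed=1):
    # Minimal (mu + lambda) / (mu, lambda) evolution strategy with a fixed step size.
    rng = random.Random(seed)
    parents = [[rng.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(mu)]
    for _ in range(generations):
        # Each child is a Gaussian perturbation of a randomly chosen parent.
        children = [[xi + rng.gauss(0.0, sigma) for xi in rng.choice(parents)]
                    for _ in range(lam)]
        # '+' selection: parents and children compete together (implicitly elitist).
        # ',' selection: only the children are ranked, so a good parent can be lost.
        pool = parents + children if plus_selection else children
        parents = sorted(pool, key=sphere)[:mu]
    return min(sphere(p) for p in parents)

if __name__ == "__main__":
    print("(5 + 20)-ES best:", evolve(5, 20, plus_selection=True))
    print("(5 , 20)-ES best:", evolve(5, 20, plus_selection=False))

Under ‘+’ selection the best-of-population value can never worsen from one generation to the next, whereas under ‘,’ selection it can; this is exactly the distinction drawn in note 6 and the elitism referred to in note 9.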

References

  1. Goldberg DE. Genetic algorithms in search, optimisation and machine learning. Harlow: Addison-Wesley; 1989.

  2. Beightler CS, Phillips DT, Wilde DJ. Foundations of optimization. Englewood Cliffs: Prentice-Hall; 1979.

  3. Schwefel HP. Numerical optimization of computer models. Chichester: Wiley; 1981.

  4. Bellman R. Adaptive control processes: a guided tour. Princeton: Princeton University Press; 1961.

  5. Forsythe GE, Malcolm MA, Moler CB. Computer methods for mathematical computations. Englewood Cliffs: Prentice Hall; 1976.

  6. Ingber LA. Simulated annealing: practice versus theory. J Math Comput Model. 1993;18(11):29–57.

  7. Battiti R, Tecchiolli G. Simulated annealing and tabu search in the long run: a comparison on QAP tasks. Comput Math Appl. 1994;28(6):1–8.

  8. Vujcic VVK, Cangalovic MM. Tabu search methodology in global optimisation. Comput Math Appl. 1999;37(5):125–33.

  9. Paulli J. Information utilization in simulated annealing and tabu search. COAL Bull. 1993;22:28–34.

  10. Haykin S. Neural networks: a comprehensive foundation. Englewood Cliffs: Prentice Hall; 1994.

  11. Fogel DB. An overview of evolutionary programming. In: Evolutionary algorithms. Berlin: Springer; 1999. p. 89–109.

  12. Lewontin RC. The genetic basis of evolutionary change. NY: Columbia University Press; 1974.

  13. Atmar JW. On the rules and nature of simulated evolutionary programming. In: Proceedings of the first annual conference on evolutionary programming. 1992.

  14. Mayr E. The evolution of life. In: Tax S, Callender C, editors. Evolution after Darwin: issues in evolution, vol. 3. USA: University of Chicago Press; 1960.

  15. Hartl DL, Clark AG. Principles of population genetics. Sunderland: Sinauer; 1989.

  16. Mayr E. Towards a new philosophy of biology: observations of an evolutionist. Cambridge: The Belknap Press of Harvard University Press; 1988.

  17. Bremermann HJ. The evolution of intelligence. The nervous system as a model of its environment. Technical report, Department of Mathematics, University of Washington, Seattle, July; 1958.

  18. Holland JH. Adaptation in natural and artificial systems. Ann Arbor: The University of Michigan Press; 1975.

  19. Reeves CR, Wright CC. Genetic algorithms and the design of experiments. In: Evolutionary algorithms. Berlin: Springer; 1999. p. 207–27.

  20. Vose MD. What are genetic algorithms? A mathematical perspective. In: Evolutionary algorithms. USA: Springer; 1999. p. 251–76.

  21. Wolpert DH, Macready WG. No free lunch theorems for optimization. IEEE Trans Evol Comput. 1997;1(1):67–82.

  22. Droste S, Jansen T, Wegener I. Perhaps not a free lunch but at least a free appetizer. Technical report ISSN 1433-3325, Reihe computational intelligence collaborative research center 531, University of Dortmund; 1998.

  23. Corne DW, Knowles JD. No free lunch and free leftovers theorems for multiobjective optimisation problems. 2003.

  24. Grefenstette JJ. Deception considered harmful. In: Whitley LD, editor. Foundations of genetic algorithms 2. 1993.

  25. Fogel DB, Ghozeil A. Schema processing under proportional selection in the presence of random effects. IEEE Trans Evol Comput. 1997;1(4).

  26. Poli R. Why the schema theorem is correct also in the presence of stochastic effects. In: Proceedings of the 2000 congress on evolutionary computation, vol. 1. 2000. p. 487–92.

  27. Radcliffe NJ. Schema processing. In: Back T, Fogel DB, Michalewicz Z, editors. Handbook of evolutionary computation. New York: Oxford University Press; 1997.

  28. Goldberg DE. An analysis of reproduction and crossover in a binary coded genetic algorithm. 1987.

  29. Vose MD, Liepins GE. Punctuated equilibria in genetic search. Complex Syst. 1991;5:31–44.

  30. Nix AE, Vose MD. Modelling genetic algorithms using Markov chains. Ann Math Artif Intell. 1992;4:79–88.

  31. Prügel-Bennett A, Shapiro JL. An analysis of genetic algorithms using statistical mechanics. Phys Rev Lett. 1994;72(9):1305–9.

  32. Rogers A, Prügel-Bennett A. Modelling the dynamics of a steady state genetic algorithm. Proc Found Genet Algorithms. 1999;5:57–68.

  33. Goldberg D. The race, the hurdle, and the sweet spot. In: Bentley PJ, editor. Evolutionary design by computers. San Francisco: Morgan Kaufmann; 1999. p. 105–18.

  34. Goldberg DE. Genetic algorithms and Walsh functions: Part II, deception and its analysis. Complex Syst. 1989;3.

  35. Whitley D. Fundamental principles of deception in genetic search. In: Rawlins GJE, editor. Foundations of genetic algorithms. San Francisco: Morgan Kaufmann; 1991. p. 221–41.

  36. Mitchell M, Holland JH, Forrest S. When will a genetic algorithm outperform hill climbing? In: Cowan JD, Tesauro G, Alspector J, editors. Advances in neural information processing systems 6. San Mateo: Morgan Kaufmann; 1994.

  37. Reeves CR, Wright CC. Epistasis in genetic algorithms: an experimental design perspective. In: Proceedings of the 6th international conference on genetic algorithms. 1995.

  38. Schaffer JD. Some experiments in machine learning using vector evaluated genetic algorithms. PhD thesis, Vanderbilt University, Nashville; 1984.

  39. Spears WM, De Jong KA. Analyzing GAs using Markov models with semantically ordered and lumped states. Foundations of genetic algorithms 4. 1996.

  40. Rechenberg I. Cybernetic solution path of an experimental problem. Royal Aircraft Establishment, library translation no. 1122, Farnborough, Hants; 1965.

  41. Rechenberg I. Evolutionsstrategie: Optimierung technischer Systeme nach Prinzipien der biologischen Evolution. Stuttgart: Frommann-Holzboog; 1973.

  42. Schwefel HP. Evolutionsstrategie und numerische Optimierung. PhD thesis, Technische Universität Berlin, May; 1975.

  43. Schwefel HP. Numerische Optimierung von Computer-Modellen mittels der Evolutionsstrategie, volume 26 of interdisciplinary systems research. Basel: Birkhauser; 1977.

  44. Radding C. Homologous pairing and strand exchange in genetic recombination. Annu Rev Genet. 1982;16:405–37.

  45. Futuyma DJ. Evolutionary biology. Sunderland: Sinauer Associates; 1986.

  46. Hansen N, Ostermeier A. Adapting arbitrary normal mutation distributions in evolution strategies: the covariance matrix adaptation. In: Proceedings of the IEEE international conference on evolutionary computation. 1996. p. 312–7.

  47. Baker JE. Reducing bias and inefficiency in the selection algorithm. In: Proceedings of the 2nd international conference on genetic algorithms. 1987.

  48. Le Riche R, Vayssade RT, Haftka RT. A segregated genetic algorithm for constrained optimisation in structural mechanics. Technical report, Université de Technologie de Compiègne, France; 1995.

  49. Michalewicz Z. Genetic algorithms + data structures = evolution programs. Berlin: Springer; 1994.

  50. Richardson JT, Palmer MR, Liepins G, Hillard M. Some guidelines for genetic algorithms with penalty functions. In: Proceedings of the 3rd international conference on genetic algorithms. 1989.

  51. Homaifar A, Lai SHY, Qi X. Constrained optimisation via genetic algorithms. Simulation. 1994;62(4):242–354.

  52. Joines J, Houck C. On the use of non-stationary penalty functions to solve nonlinear constrained optimisation problems with GAs. In: Fogel D, editor. Proceedings of the first IEEE conference on evolutionary computation. Piscataway: IEEE Press; 1994. p. 579–84.

  53. Siedlecki W, Sklansky J. Constrained genetic optimisation via dynamic reward-penalty balancing and its use in pattern recognition. In: Schaffer JD, editor. Proceedings of the third international conference on genetic algorithms. San Francisco: Morgan Kaufmann; 1989. p. 10–9.

  54. Michalewicz Z, Attia N. Evolutionary optimisation of constrained problems. In: Proceedings of the 3rd annual conference on evolutionary programming. 1994.

  55. Kirkpatrick S, Gelatt CD, Vecchi MP. Optimisation by simulated annealing. Science. 1983;220:671–80.

  56. Bean JC, Hadj-Alouane AB. A dual genetic algorithm for bounded integer programs. Technical report TR 92-53, Department of Industrial and Operations Engineering, The University of Michigan; 1992.

  57. Hadj-Alouane AB, Bean JC. A genetic algorithm for the multiple-choice integer program. Technical report TR 92-50, Department of Industrial and Operations Engineering, The University of Michigan; 1992.

  58. Smith AE, Tate DM. Genetic optimisation using a penalty function. In: Proceedings of the 5th international conference on genetic algorithms. 1993.

  59. Coit DW, Smith AE, Tate DM. Adaptive penalty methods for genetic optimisation of constrained combinatorial problems. INFORMS J Comput. 1996;8(2):173–82.

  60. Coit DW, Smith AE. A penalty guided genetic search for reliability design optimisation. Comput Ind Eng. 1996;30(4):895–904.

  61. Michalewicz Z, Schoenauer M. Evolutionary algorithms for constrained parameter optimisation problems. Evol Comput. 1996;4(1):1–32.

  62. Gen M, Cheng R. A survey of penalty techniques in genetic algorithms. In: Proceedings of the 1996 international conference on evolutionary computation. 1996.

  63. Eiben AE, Van der Hauw JK. Adaptive penalties for evolutionary graph coloring. In: Hao JK, Lutton E, Ronald E, Schoenauer M, Snyers D, editors. Artificial evolution, LNCS no. 1363. Berlin: Springer; 1997. p. 95–106.

  64. Coello CA. Use of a self-adaptive penalty approach for engineering optimisation problems. Comput Ind. 1999.

  65. Coello CA. Self-adaptive penalties for ga-based optimisation. In: Proceedings of the 1999 congress on evolutionary computation. 1999.

  66. Deb K. An efficient constraint handling method for genetic algorithms. Comput Methods Appl Mech Eng. 1999.

  67. Whitley D, Gordon VS, Mathias K. Lamarckian evolution, the Baldwin effect and function optimisation. In: Proceedings of parallel problem solving from nature 3. 1994.

  68. Porter B, Jones AH. Genetic tuning of digital PID controllers. Electron Lett. 1992;28:843–4.

  69. Porter B, Borairi M. Genetic design of linear multivariable feedback control systems using eigenstructure assignment. Int J Syst Sci. 1992;23:1387–90.

  70. Porter B, Mohamed SS. Genetic design of minimum-time controllers. Electron Lett. 1993;29:1897–8.

  71. Porter B. Genetic design of control systems. Trans Soc Instrum Control Eng. 1995;34(5):393–402.

  72. Booker L. Improving search in genetic algorithms. In: Davis L, editor. Genetic algorithms and simulated annealing. San Francisco: Morgan Kaufmann; 1987. p. 61–73.

  73. Spears WM, De Jong KA. On the Virtues of parameterised uniform crossover. In: Belew RK, Booker LB, editors. Proceedings of the fourth international conference on genetic algorithms. San Francisco: Morgan Kaufmann; 1991. p. 230–6.

  74. Syswerda G. Uniform crossover in genetic algorithms. In: Schaffer JD, editor. Proceedings of the third international conference on genetic algorithms. San Francisco: Morgan Kaufmann; 1989. p. 2–9.

  75. Eshelman LJ, Caruana R, Schaffer JD. Biases in the crossover landscape. In: Schaffer JD, editor. Proceedings of the third international conference on genetic algorithms. San Francisco: Morgan Kaufmann; 1989. p. 10–9.

  76. De Jong K. The analysis and behaviour of a class of genetic adaptive systems. PhD thesis, University of Michigan, Ann Arbor; 1975.

  77. Spears WM, De Jong KA. In: Rawlins GJE, editor. Foundations of genetic algorithms. 1991. p. 301–15.

  78. Mühlenbein H, Schlierkamp-Voosen D. Predictive models for the breeder genetic algorithm: I. continuous parameter optimization. Evol Comput. 1993;1(1):25–49.

  79. Muhlenbein H. The breeder genetic algorithm—a provable optimal search algorithm and its application. In: Proceedings of the IEE colloquium on applications of genetic algorithms, London; 1994.

  80. Voigt H-M, Born J, Santibanez-Koref I. Modeling and simulation of distributed evolutionary search processes for function optimization. In: Parallel problem solving from nature, volume 496 of lecture notes in computer science. New York: Springer; 1991. p. 373–80.

  81. Goldberg DE, Deb K. In: Rawlins GJE, editor. Foundations of genetic algorithms. 1991. p. 69–93.

  82. Brindle A. Genetic algorithms for function optimization. PhD thesis, University of Alberta; 1981.

  83. Grefenstette JJ. Genesis: a system for using genetic search procedures. In: Proceedings of the 1984 conference on intelligent systems and machines. 1984. p. 161–5.

  84. Grefenstette JJ, Baker JE. How genetic algorithms work: a critical look at implicit parallelism. In: Schaffer JD, editor. Proceedings of the third international conference on genetic algorithms. San Francisco: Morgan Kaufmann; 1989. p. 20–7.

  85. Davis L. Adapting operator probabilities in genetic algorithms. In: Schaffer JD, editor. Proceedings of the third international conference on genetic algorithms. San Francisco: Morgan Kaufmann; 1989. p. 61–9.

  86. Whitley D. The GENITOR algorithm and selection pressure: why rank based allocation of reproductive trials is best. In: Schaffer JD, editor. Proceedings of the third international conference on genetic algorithms. San Francisco: Morgan Kaufmann; 1989. p. 116–23.

  87. Hinterding R, Michalewicz Z, Eiben AE. Adaptation in evolutionary computation: a survey. In: Proceedings of the 4th IEEE conference on evolutionary computation, Piscataway; 1997. p. 65–9.

  88. Eiben AE, Hinterding R, Michalewicz Z. Parameter control in evolutionary algorithms. IEEE Trans Evol Comput. 1999;3(2):124–41.

  89. Fogarty TC. Varying the probability of mutation in the genetic algorithm. In: Proceedings of the 3rd international conference on genetic algorithms. 1989. p. 104–9.

  90. Hesser J, Manner R. Towards an optimal mutation probability for genetic algorithms. In: Schwefel HP, Manner R, editors. Proceedings of the 1st conference on parallel problem solving from nature. Number 496 lecture notes in computer science. Berlin: Springer; 1990. p. 23–32.

  91. Davis L. Job shop scheduling with genetic algorithms. In: Proceedings of the 1st international conference on genetic algorithms. 1985. p. 136–40.

  92. Ackley DH. Chapter 13: an empirical study of bit vector function optimization. In: Genetic algorithms and simulated annealing. Pitman; 1987. p. 170–204.

  93. Bramlette MF. Initialization, mutation and selection methods in genetic algorithms for function optimization. In: Proceedings of the 4th international conference on genetic algorithms. 1991. p. 100–7.

  94. Michalewicz Z, Janikow CZ. Handling constraints in genetic algorithms. In: Proceedings of the 4th international conference on genetic algorithms. 1991. p. 151–7.

  95. Smith J, Fogarty TC. Self-adaptation of mutation rates in a steady state genetic algorithm. In: Proceedings of the 3rd IEEE conference on evolutionary computation. 1996. p. 318–23.

  96. Fogel DB, Fogel LJ, Atmar JW. Meta-evolutionary programming approach. In: Proceedings of the 25th Asilomar conference on signals, systems, and computers. 1991. p. 540–5.

  97. Back T. Self-adaptation in genetic algorithms. In: Proceedings of the 1st European conference on artificial life. 1992.

  98. Back T, Hammel U, Schwefel HP. Evolutionary computation: comments on the history and current state. IEEE Trans Evol Comput. 1997;1(1):3–17.

  99. Ostermeier A, Gawelczyk A, Hansen N. Step-size adaptation based on non-local use of selection information. PPSN3. 1994. p. 189–98.

  100. Hansen N, Ostermeier A. Adapting arbitrary normal mutation distributions in evolution strategies: the covariance matrix adaptation. In: Proceedings of the 1996 IEEE international conference on evolutionary computation. 1996. p. 312–7.

  101. Back T. Evolutionary algorithms in theory and practice. New York: Oxford University Press; 1996.

  102. Yao X, Liu Y. Fast evolutionary programming. In: Fogel LJ, Angeline PJ, Back T, editors. Proceedings of the 5th annual conference on evolutionary programming. Cambridge: MIT Press; 1996. p. 451–60.

  103. Schwefel HP. Problems of constancy and change—the complementarity of systems approaches to complexity. In: Checkland P, Kiss I, editors. 31st Annual meeting of the international society for general system research, vol. 2. International Society for General System Research. 1987. p. 1025–33.

  104. Hesser J, Manner R. Self-adaptation. In: Back T, Fogel D, Michalewicz Z, editors. Handbook of evolutionary computation. New York: Oxford University Press; 1997. p. C7.1:1–1:15.

  105. Beyer HG. Toward a theory of evolution strategies: self-adaptation. Evol Comput. 1996;3(3):311–47.

  106. Auger A. Convergence results for (1,λ)-SA-ES using the theory of φ-irreducible Markov chains. Theor Comput Sci. 2005;334:35–69.

  107. Deb K, Beyer HG. Self-adaptive genetic algorithms with simulated binary crossover. Evol Comput. 2001;9(2):197–221.

  108. Lin L, Delgado-Frias JG, Gause DC, Vassiliadis S. Hybrid Newton-Raphson genetic algorithm for the traveling salesman problem. J Cybern. 1995;26(4):378–412.

  109. Fujita K, Akagi S, Hirokawa N. Hybrid approach for optimal nesting using a genetic algorithm and a local minimization algorithm. In: Proceedings of the ASME design automation conference. 2003. p. 477–84.

  110. Li WD, Ong SK, Nee AYC. Hybrid genetic algorithm and simulated annealing approach for the optimization of process plans for prismatic parts. Int J Prod Res. 2002;24:1899–922.

  111. Mahfoud SW, Goldberg DE. Parallel recombinative simulated annealing: a genetic algorithm. Parallel Comput. 1995;21(1):1–28.

  112. Thangiah SR, Osman IH, Sun T. Hybrid genetic algorithms, simulated annealing and tabu search methods for vehicle routing problems with time windows. Technical report UKC/OR94/4, Institute of Mathematics and Statistics, University of Kent, Canterbury; 1994.

  113. Glover F, Kelly JP, Laguna M. Genetic algorithms and tabu search: hybrids for optimization. Comput Oper Res. 1995;22(1):111–34.

  114. Shi XH, Liang YC, Lee HP, Lu C, Wang LM. An improved GA and a novel PSO-GA-based hybrid algorithm. Inf Process Lett. 2005;93(5):255–61.

  115. Kao YT, Zahara E. A hybrid genetic algorithm and particle swarm optimization for multi-modal functions. Appl Soft Comput. 2008;8(2):849–57.

  116. Wang L. A hybrid genetic algorithm—neural network strategy for simulation optimization. Appl Math Comput. 2005;170(2):1329–43.

  117. Huang K, Zhan XL, Chen FQ, Lu DW. Catalyst design for methane oxidative coupling by using artificial neural network and hybrid genetic algorithm. Chem Eng Sci. 2003;58(1):81–7.

  118. Cordón O, Gomide F, Herrera F, Hoffmann F, Magdalena L. Ten years of genetic fuzzy systems: current framework and new trends. Fuzzy Sets Syst. 2004;141(1):5–31.

  119. Storn R, Price K. Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces. J Global Optim. 1997;11:341–59.

  120. Lampinen J, Zelinka I. On stagnation of the differential evolution algorithm. In: Proceedings of MENDEL. 2000.

  121. Storn R. Designing digital filters with differential evolution. In: Corne D, Dorigo M, Glover F, editors. New ideas in optimization. London: McGraw-Hill; 1999. p. 109–25.

  122. Press WH, Teukolsky SA, Vetterling WT, Flannery BP. Numerical recipes in C. Cambridge: Cambridge University Press; 1992.

  123. Babu BV, Angira R. A differential evolution approach for global optimization of MINLP problems. In: Proceedings of 4th Asia Pacific conference on simulated evolution and learning. 2002.

  124. Cardoso MF, Salcedo RL, Azevedo SF, Barbosa D. A simulated annealing approach to the solution of MINLP problems. Comput Chem Eng. 1997;21(12):1349–64.

  125. Lin YC, Hwang KS, Wang FS. Co-evolutionary hybrid differential evolution for mixed-integer optimization problems. Eng Opt. 2001;00:1–20.

  126. Gupta K. Branch and bound experiments in nonlinear integer programming. PhD thesis, Purdue University; 1980.

  127. Loh HT. A Sequential linearisation approach for mixed-discrete nonlinear design optimization. PhD thesis, University of Michigan, Department of Mechanical Engineering and Applied Mechanics; 1989.

  128. Zhang C, Wang HP. Mixed-discrete nonlinear optimization with simulated annealing. Eng Optim. 1993;21:277–91.

  129. Lampinen J, Zelinka I. Mixed variable non-linear optimization by differential evolution. In: Proceedings of Nostradamus. 1999.

  130. Sandgren E. Nonlinear integer and discrete programming in mechanical design optimization. Trans ASME J Mech Des. 1990;112:223–9.

  131. Lopez Cruz IL, Van Willigenburg LG, Van Straten G. Efficient differential evolution algorithms for multimodal optimal control problems. Appl Soft Comput. 2003;3:97–122.

  132. Chipperfield A, Fleming P. Multiobjective gas turbine engine controller design using genetic algorithms. IEEE Trans Ind Electron. 1996;43(5):1–5.

  133. Horn J, Nafpliotis N. Multiobjective optimisation using the niched pareto genetic algorithm. Technical report IlliGAL Report 93005, Illinois Genetic Algorithms Lab, University of Illinois; 1993.

  134. Booker L. Improving the performance of genetic algorithms in classifier systems. In: Proceedings of the 1st international conference on genetic algorithms. 1985. p. 80–92.

  135. Davidor Y. A naturally occurring niche and species phenomenon: the model and first results. In: Proceedings of the 4th international conference on genetic algorithms. 1991. p. 257–63

  136. Davis L, Coombs S. Genetic algorithms and communication link speed design: theoretical considerations. In: Proceedings of the 2nd international conference on genetic algorithms. 1987. p. 252–6.

  137. Davidor Y. Epistasis variance: suitability of a representation to genetic algorithms. Complex Syst. 1990;4:369–83.

  138. Beasley D, Bull DR, Martin RR. Reducing epistasis in combinatorial problems by expansive coding. In: Proceedings of the 5th international conference on genetic algorithms. 1993. p. 400–7.

  139. Davis L. Applying adaptive algorithms to epistasis domains. In: 9th International joint conference on artificial intelligence. 1985. p. 162–4.

Author information

Corresponding author

Correspondence to Amin Nobakhti.

About this article

Cite this article

Nobakhti, A. On Natural Based Optimization. Cogn Comput 2, 97–119 (2010). https://doi.org/10.1007/s12559-010-9039-2
