A Fitness Estimation Strategy for Genetic Algorithms

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2358)

Abstract

Genetic Algorithms (GAs) are a popular and robust strategy for optimisation problems. However, these algorithms often require enormous computational power to solve real problems and are frequently criticised for their slow operation. For most applications, the bottleneck of a GA is the fitness evaluation task. This paper introduces a fitness estimation strategy (FES) for genetic algorithms that does not evaluate every new individual and therefore runs faster. Each new individual is assigned a fitness value and an associated reliability value, and is evaluated with the true fitness function only if its reliability falls below a threshold. Applying random evaluation and error compensation strategies to the FES further improves the algorithm's performance. Simulation results on six optimisation functions show that the GA with FES requires fewer evaluations while obtaining solutions similar to those found by a traditional genetic algorithm. On the same functions, it also generally finds a better average fitness value for the same number of evaluations. In addition, the GA with FES does not suffer from premature convergence of the population: it climbs faster in the early stages of evolution without becoming trapped in local minima.
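
The abstract does not give the exact inheritance and update rules, so the following Python sketch only illustrates one plausible form of the idea: offspring inherit an averaged fitness estimate from their parents together with a decayed reliability value, and the true fitness function is called only when reliability drops below a threshold or for an occasional random re-evaluation. The threshold, decay factor, and random-evaluation rate are illustrative assumptions, not the paper's published parameters.

```python
import random

# Illustrative sketch of a fitness estimation strategy (FES).
# Assumptions (not taken from the paper): offspring inherit the mean of their
# parents' fitness values, reliability decays by a fixed factor per
# inheritance step, and a small fraction of individuals are re-evaluated at
# random to compensate for accumulated estimation error.

RELIABILITY_THRESHOLD = 0.7   # below this, fall back to the true fitness function
RELIABILITY_DECAY = 0.8       # inherited reliability shrinks with each inheritance
RANDOM_EVAL_RATE = 0.05       # occasional true evaluations limit error build-up


def true_fitness(genes):
    """Stand-in for an expensive fitness function (sphere benchmark)."""
    return sum(g * g for g in genes)


class Individual:
    def __init__(self, genes, fitness=None, reliability=0.0):
        self.genes = genes
        self.fitness = fitness
        self.reliability = reliability


def inherit_estimate(parent_a, parent_b, child_genes):
    """Give a child an estimated fitness and a reduced reliability value."""
    est = 0.5 * (parent_a.fitness + parent_b.fitness)
    rel = RELIABILITY_DECAY * 0.5 * (parent_a.reliability + parent_b.reliability)
    return Individual(child_genes, fitness=est, reliability=rel)


def evaluate_if_needed(ind, eval_count):
    """Call the true fitness function only when the estimate is unreliable,
    or occasionally at random; return the updated evaluation count."""
    if ind.reliability < RELIABILITY_THRESHOLD or random.random() < RANDOM_EVAL_RATE:
        ind.fitness = true_fitness(ind.genes)
        ind.reliability = 1.0   # a true evaluation is treated as fully reliable
        eval_count += 1
    return eval_count
```

In a full GA loop, the saving would come from calling the true fitness function only for the subset of offspring whose inherited reliability falls below the threshold, rather than for every new individual in each generation.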

Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Salami, M., Hendtlass, T. (2002). A Fitness Estimation Strategy for Genetic Algorithms. In: Hendtlass, T., Ali, M. (eds) Developments in Applied Artificial Intelligence. IEA/AIE 2002. Lecture Notes in Computer Science (LNAI), vol 2358. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-48035-8_49

  • DOI: https://doi.org/10.1007/3-540-48035-8_49

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-43781-9

  • Online ISBN: 978-3-540-48035-8
