
Evolutionary Neuroestimation of Fitness Functions

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2902)

Abstract

One of the most influential factors in the quality of the solutions found by an evolutionary algorithm is the appropriateness of the fitness function. In data mining, where extracting useful information is the main task, the fitness function becomes very time consuming when the database contains a large number of examples. Approximating fitness values can therefore reduce the associated computational cost. In this paper we present the Neural–Evolutionary Model (NEM), which uses a neural network as a fitness function estimator. The network is trained during the evolutionary process and is progressively used to estimate fitness values, which preserves search efficiency while alleviating the computational overhead of the fitness function. We show that NEM is faster than the traditional evolutionary algorithm, under certain assumptions about the total number of estimations carried out by the neural network. The Neural–Evolutionary Model is therefore useful when datasets contain a large number of examples.

The research was supported by the Spanish Research Agency CICYT under grant TIC2001–1143–C03–02.
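The abstract describes training a neural network on exact fitness evaluations gathered during evolution and then progressively using it to estimate fitness for part of the population. The following is a minimal sketch of that general idea, not the paper's implementation: the toy fitness function, the scikit-learn MLPRegressor surrogate, and parameters such as SURROGATE_RATE and WARMUP_GENERATIONS are illustrative assumptions.

```python
# Hypothetical sketch of a surrogate-assisted evolutionary loop; names and
# parameters are illustrative, not taken from the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def true_fitness(individual):
    # Placeholder for the expensive, data-driven fitness (e.g. evaluating a
    # rule set over a large dataset); assumed costly to compute.
    return -np.sum((individual - 0.5) ** 2)

POP_SIZE, GENOME_LEN, GENERATIONS = 30, 10, 50
SURROGATE_RATE = 0.7        # fraction of individuals scored by the network
WARMUP_GENERATIONS = 5      # early generations use exact fitness only

surrogate = MLPRegressor(hidden_layer_sizes=(16,), max_iter=500)
archive_X, archive_y = [], []            # exact evaluations seen so far

population = rng.random((POP_SIZE, GENOME_LEN))
for gen in range(GENERATIONS):
    fitness = np.empty(POP_SIZE)
    use_surrogate = gen >= WARMUP_GENERATIONS and len(archive_X) > 0
    for i, ind in enumerate(population):
        if use_surrogate and rng.random() < SURROGATE_RATE:
            # Cheap estimate from the neural network.
            fitness[i] = surrogate.predict(ind.reshape(1, -1))[0]
        else:
            # Exact, expensive evaluation; archived as training data.
            fitness[i] = true_fitness(ind)
            archive_X.append(ind.copy())
            archive_y.append(fitness[i])
    # Retrain the estimator on all exact evaluations gathered so far.
    surrogate.fit(np.array(archive_X), np.array(archive_y))
    # Simple truncation selection plus Gaussian mutation.
    order = np.argsort(fitness)[::-1]
    parents = population[order[: POP_SIZE // 2]]
    children = parents + rng.normal(scale=0.05, size=parents.shape)
    population = np.vstack([parents, children])

# Report the best individual according to the exact fitness.
best = max(population, key=true_fitness)
print("best exact fitness:", true_fitness(best))
```

The trade-off controlled by SURROGATE_RATE is the one the abstract points at: the more individuals the network scores, the lower the evaluation cost, at the price of some estimation error in the selection step.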





Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Aguilar-Ruiz, J.S., Mateos, D., Rodriguez, D.S. (2003). Evolutionary Neuroestimation of Fitness Functions. In: Pires, F.M., Abreu, S. (eds) Progress in Artificial Intelligence. EPIA 2003. Lecture Notes in Computer Science, vol 2902. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-24580-3_15


  • DOI: https://doi.org/10.1007/978-3-540-24580-3_15

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-20589-0

  • Online ISBN: 978-3-540-24580-3

  • eBook Packages: Springer Book Archive
