Incorporating Knowledge in Evolutionary Prototype Selection

  • Conference paper
Intelligent Data Engineering and Automated Learning – IDEAL 2006 (IDEAL 2006)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 4224)

Abstract

Evolutionary algorithms have recently been applied to prototype selection with good results. An important problem in prototype selection is the increasing size of data sets, which can harm evolutionary algorithms by deteriorating convergence and increasing time complexity. In this paper, we offer a preliminary proposal to address these drawbacks: an evolutionary algorithm that incorporates knowledge about the prototype selection problem. This study includes a comparison between our proposal and other evolutionary and non-evolutionary prototype selection algorithms. The results show that incorporating knowledge improves the performance of evolutionary algorithms and considerably reduces execution time.
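Evolutionary prototype selection is typically posed as a binary search problem: each chromosome flags which training instances to keep, and fitness trades 1-NN classification accuracy against the reduction rate. The sketch below is a minimal, generic genetic algorithm in that style, not the knowledge-incorporating algorithm proposed in this paper; the synthetic data set, population size, crossover scheme, and mutation rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic two-class data set (a stand-in for a real benchmark set).
X = np.vstack([rng.normal(0, 1, (40, 2)), rng.normal(3, 1, (40, 2))])
y = np.repeat([0, 1], 40)

def fitness(mask, alpha=0.5):
    """Weighted sum of 1-NN accuracy and reduction rate.

    Every training instance is classified by its nearest selected
    prototype (excluding itself); alpha balances accuracy against the
    fraction of instances removed.
    """
    sel = np.flatnonzero(mask)
    if sel.size == 0:
        return 0.0
    # Distances from all instances to the selected prototypes.
    d = np.linalg.norm(X[:, None, :] - X[sel][None, :, :], axis=2)
    d[np.arange(len(X))[:, None] == sel[None, :]] = np.inf  # no self-match
    acc = np.mean(y[sel][d.argmin(axis=1)] == y)
    red = 1.0 - sel.size / len(X)
    return alpha * acc + (1.0 - alpha) * red

# Plain generational GA over binary chromosomes, one bit per instance.
pop = rng.random((30, len(X))) < 0.5
best = max(pop, key=fitness).copy()
for _ in range(50):
    f = np.array([fitness(m) for m in pop])
    if f.max() > fitness(best):
        best = pop[f.argmax()].copy()
    # Binary tournament selection.
    a, b = rng.integers(0, len(pop), (2, len(pop)))
    parents = pop[np.where(f[a] >= f[b], a, b)]
    # Uniform crossover with a shuffled mate, then bit-flip mutation.
    mates = parents[rng.permutation(len(parents))]
    pop = np.where(rng.random(pop.shape) < 0.5, parents, mates)
    pop ^= rng.random(pop.shape) < 0.01
    pop[0] = best  # elitism: the best chromosome found so far survives

print(f"kept {best.sum()}/{len(X)} prototypes, fitness={fitness(best):.3f}")
```

Each fitness evaluation here runs a full 1-NN pass over the data, which is exactly the cost that grows painfully with data set size; this is the bottleneck the paper's knowledge-based proposal aims to alleviate.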




Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

García, S., Cano, J.R., Herrera, F. (2006). Incorporating Knowledge in Evolutionary Prototype Selection. In: Corchado, E., Yin, H., Botti, V., Fyfe, C. (eds) Intelligent Data Engineering and Automated Learning – IDEAL 2006. IDEAL 2006. Lecture Notes in Computer Science, vol 4224. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11875581_161

  • DOI: https://doi.org/10.1007/11875581_161

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-45485-4

  • Online ISBN: 978-3-540-45487-8

  • eBook Packages: Computer Science (R0)
