
An immune-inspired instance selection mechanism for supervised classification

  • Regular Research Paper
  • Published in: Memetic Computing

Abstract

A central issue in data classification is finding an optimal subset of instances with which to train a classifier. Training sets that represent the characteristics of each class well are more likely to yield a successful predictor. In some cases, however, the data are redundant or consume large amounts of computing time during learning. Instance selection techniques have been proposed to overcome this issue. They remove examples from the data set so that classifiers are built faster and, in some cases, with better accuracy. Some of these techniques are based on nearest neighbors, ordered removal, random sampling, and evolutionary methods. Their weaknesses generally involve poor accuracy, overfitting, lack of robustness as the data set size increases, and high complexity. This work proposes a simple and fast immune-inspired suppressive algorithm for instance selection, called SeleSup. Under the immune system's self-regulation mechanisms, cells unable to neutralize danger tend to disappear from the organism. By analogy, data not relevant to the learning of a classifier are eliminated from the training process. The proposed method was compared with three important instance selection algorithms on a number of data sets. The experiments showed that our mechanism substantially reduces the data set size while remaining accurate and robust, especially on larger data sets.
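The suppression analogy lends itself to a short sketch. The following Python function is a minimal illustration only, not the published SeleSup procedure: it assumes a 1-nearest-neighbour "challenge" in which instances whose nearest surviving neighbour carries a different class label fail to neutralize their neighbourhood and are suppressed, with the process repeated for a few rounds. The name suppressive_select and the round-based loop are illustrative assumptions.

```python
import numpy as np

def suppressive_select(X, y, n_rounds=3):
    """Return indices of training instances that survive suppression.

    Illustrative sketch: each instance acts as an immune cell that is
    challenged by its nearest surviving neighbour; a cell whose nearest
    neighbour belongs to another class fails the challenge and is
    suppressed (removed). Repeating the rounds lets the survivor set
    stabilise.
    """
    alive = np.ones(len(X), dtype=bool)
    for _ in range(n_rounds):
        idx = np.flatnonzero(alive)
        if idx.size < 2:            # nothing left to challenge against
            break
        failed = []
        for i in idx:
            others = idx[idx != i]
            # Euclidean distance from instance i to every other survivor.
            d = np.linalg.norm(X[others] - X[i], axis=1)
            nearest = others[np.argmin(d)]
            if y[nearest] != y[i]:  # failed to "neutralize" its neighbourhood
                failed.append(i)
        if not failed:              # survivor set has stabilised
            break
        alive[failed] = False
    return np.flatnonzero(alive)

# Usage (X_train: 2-D feature array, y_train: 1-D label array):
#   keep = suppressive_select(X_train, y_train)
#   clf.fit(X_train[keep], y_train[keep])
```

In spirit this resembles an iterated edited-nearest-neighbour filter; the real algorithm's affinity measure and stopping criterion would replace the crude 1-NN test above.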



Author information

Corresponding author

Correspondence to Grazziela P. Figueredo.

About this article

Cite this article

Figueredo, G.P., Ebecken, N.F.F., Augusto, D.A. et al. An immune-inspired instance selection mechanism for supervised classification. Memetic Comp. 4, 135–147 (2012). https://doi.org/10.1007/s12293-012-0081-3
