Automated feature selection in neuroevolution

  • Research Paper
  • Published in: Evolutionary Intelligence

Abstract

Feature selection is a task of great importance. Many feature selection methods have been proposed; they can broadly be divided into two groups, filter and wrapper methods, according to their dependence on the learning algorithm or classifier. Recently, Whiteson et al. proposed Feature Selective NeuroEvolution of Augmenting Topologies (FS-NEAT), a method that selects features at the same time as it evolves the neural networks that use those features as inputs. In this paper, a novel feature selection method called Feature Deselective NeuroEvolution of Augmenting Topologies (FD-NEAT) is presented. FD-NEAT begins with fully connected inputs in its networks and drops irrelevant or redundant inputs as evolution progresses. Herein, the performance of FD-NEAT, FS-NEAT and traditional NEAT is compared on several mathematical problems and in a challenging race car simulator domain (RARS). On the whole, the results show that FD-NEAT significantly outperforms FS-NEAT in terms of network performance and feature selection, and evolves networks that offer the best compromise between network size and performance.
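The deselection idea can be illustrated with a minimal Python sketch, in which a genome starts with every input connected and a structural mutation may disable input connections so that irrelevant features drop out over generations. This is a rough sketch under assumed names (ConnectionGene, fully_connected_genome, mutate_deselect) and an illustrative disable probability, not the authors' implementation.

    import random

    class ConnectionGene:
        """One connection in a NEAT-style genome; disabled genes carry no signal."""
        def __init__(self, src, dst, weight, enabled=True, is_input_conn=False):
            self.src, self.dst = src, dst
            self.weight = weight
            self.enabled = enabled
            self.is_input_conn = is_input_conn  # True if src is an input node

    def fully_connected_genome(n_inputs):
        # FD-NEAT starts fully connected (here, every input feeds one output
        # node "out"), in contrast to FS-NEAT, which starts nearly unconnected.
        return [ConnectionGene(src=i, dst="out", weight=random.uniform(-1, 1),
                               enabled=True, is_input_conn=True)
                for i in range(n_inputs)]

    def mutate_deselect(genome, disable_prob=0.05):
        # Structural mutation: each enabled input connection may be disabled,
        # effectively deselecting the corresponding feature.
        for gene in genome:
            if gene.is_input_conn and gene.enabled and random.random() < disable_prob:
                gene.enabled = False

    def selected_features(genome):
        # A feature counts as selected while at least one enabled connection
        # leaves its input node.
        return {g.src for g in genome if g.is_input_conn and g.enabled}

Under repeated applications of a mutation like mutate_deselect combined with fitness-based selection, only inputs whose connections contribute to performance tend to remain enabled; the surviving set is the selected feature subset.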



References

  1. Zongker D, Jain A (1996) Algorithms for feature selection: an evaluation. In: Proceedings of the 13th international conference on pattern recognition. Vienna, Austria, pp 18–22

  2. Kittler J (1978) Feature set search algorithms. In: Chen CH (ed) Pattern recognition and signal processing. Sijthoff and Noordhoff, Alphen aan den Rijn, Netherlands, pp 41–60


  3. Mao KZ (2002) Fast orthogonal forward selection algorithm for feature subset selection. IEEE Trans Neural Netw 13:1218–1224


  4. Narendra PM, Fukunaga K (1977) A branch and bound algorithm for feature subset selection. IEEE Trans Comput C-26:917–922


  5. Ferri FJ, Pudil P, Hatef M, Kittler J (1994) Comparative study of techniques for large-scale feature selection. In: Gelsema ES, Kanal LN (eds) Pattern recognition in practice IV. Elsevier Science B.V., Amsterdam, pp 403–413


  6. Pudil P, Novovicova J, Kittler J (1994) Floating search methods in feature selection. Pattern Recognit Lett 15:1119–1125


  7. Jain A, Zongker D (1997) Feature selection: evaluation, application, and small sample performance. IEEE Trans Pattern Anal Machine Intell 19:153–157


  8. Kudo M, Sklansky J (2000) Comparison of algorithms that select features for pattern classifiers. Pattern Recognit 33:25–41


  9. Kohavi R, John GH (1997) Wrappers for feature subset selection. Artif Intell 97:273–324


  10. Langley P (1994) Selection of relevant features in machine learning. In: Proceedings of the AAAI fall symposium on relevance. AAAI Press, New Orleans

  11. Bonnlander BV, Weigend AS (1994) Selecting input variables using mutual information and nonparametric density estimation. In: Proceedings of the 1994 international symposium on artificial neural networks (ISANN’94). Tainan, Taiwan, pp 42–50

  12. Yang J, Honavar V (1998) Feature subset selection using a genetic algorithm. IEEE Intell Syst 13:44–49


  13. Whiteson S, Stanley KO, Miikkulainen R (2004) Automatic feature selection in neuroevolution. In: Proceedings of the genetic and evolutionary computation conference (GECCO), Seattle, Washington, USA

  14. Whiteson S, Stone P, Stanley KO, Miikkulainen R, Kohl N (2005) Automatic feature selection in neuroevolution. In: Proceedings of the genetic and evolutionary computation conference (GECCO), Washington, DC, USA

  15. Stanley KO, Miikkulainen R (2002) Evolving neural networks through augmenting topologies. Evol Comput 10:99–127


  16. Stanley KO, Miikkulainen R (2004) Competitive coevolution through evolutionary complexification. J Artif Intell Res 21:63–100


  17. Langley P, Sage S (1994) Oblivious decision trees and abstract cases. In: Working notes of the AAAI-94 workshop on case-based reasoning. AAAI Press, Seattle, pp 113–117

  18. Kelly JDK, Davis L (1991) Hybridizing the genetic algorithm and the k nearest neighbors classification algorithm. In: Belew RK, Booker LB (eds) Proceedings of the 4th international conference on genetic algorithms. Morgan Kaufmann, San Diego, pp 377–383


  19. Timin ME (1995) The robot auto racing simulator. Available via http://rars.sourceforge.net

  20. Gomez F, Miikkulainen R (1998) 2-D pole balancing with recurrent evolutionary networks. In: Proceedings of the international conference on artificial neural networks (ICANN-98), Skovde, Sweden. Elsevier, New York

  21. Gomez F, Miikkulainen R (1997) Incremental evolution of complex general behavior. Adapt Behav 5:317–342


  22. Gruau F, Whitley D, Pyeatt L (1996) A comparison between cellular encoding and direct encoding for genetic neural networks. In: Koza JR, Goldberg DE, Fogel DB, Riolo, RL (eds) Proceedings of the first annual conference on genetic programming, Cambridge, MA, pp 81–89

  23. Stanley KO (2004) Efficient evolution of neural networks through complexification. PhD thesis, The University of Texas at Austin

  24. Harvey I (1992) Species adaptation genetic algorithms: a basis for a continuing SAGA. In: Varela FJ, Bourgine P (eds) Proceedings of the 1st European conference on artificial life, toward a practice of autonomous systems. MIT Press/Bradford Books, Cambridge, pp 346–354


  25. Cliff D, Harvey I, Husbands P (1992) Incremental evolution of neural network architectures for adaptive behaviour. Technical report CSRP256, School of Cognitive and Computing Sciences, University of Sussex, UK

  26. Gomez FJ, Miikkulainen R (1999) Solving non-markovian control tasks with neuroevolution. In: Proceedings of the international joint conference on artificial intelligence. Stockholm, Sweden. Morgan Kaufmann, Denver

  27. Saravanan N, Fogel DB (1995) Evolving neural control systems. IEEE Expert 10(3):23–27


  28. Wieland AP (1991) Evolving neural network controllers for unstable systems. In: Proceedings of the international joint conference on neural networks, Seattle, WA. Piscataway, New Jersey, pp 667–673

  29. Yao X (1999) Evolving artificial neural networks. Proc IEEE 87:1423–1447


  30. Moriarty DE, Miikkulainen R (1996) Efficient reinforcement learning through symbiotic evolution. Machine Learn 22:11–32


  31. Radcliffe NJ (1993) Genetic set recombination and its application to neural network topology optimization. Neural Comput Appl 1:67–90


  32. Dasgupta D, McGregor D (1992) Designing application-specific neural networks using the structured genetic algorithm. In: Proceedings of the international conference on combinations of genetic algorithms and neural networks. IEEE Computer Society Press, USA, pp 87–96

  33. Pujol JCF, Poli R (1997) Evolution of the topology and the weights of neural networks using genetic programming with a dual representation. Technical report CSRP-97-7, School of Computer Science, The University of Birmingham, Birmingham B15 2TT, UK

  34. Gruau F (1993) Genetic synthesis of modular neural networks. In: Forrest S (ed) Proceedings of the fifth international conference on genetic algorithms. Morgan Kaufmann, San Mateo, CA, pp 318–325

  35. Angeline PJ, Saunders GM, Pollack JB (1993) An evolutionary algorithm that constructs recurrent neural networks. IEEE Trans Neural Netw 5:54–65


  36. Wieland A (1991) Evolving neural network controllers for unstable systems. In: Proceedings of the international joint conference on neural networks. Piscataway, New Jersey, pp 667–673

  37. Stanley KO, Bryant BD, Miikkulainen R (2005) Real-time neuroevolution in the NERO video game. IEEE Trans Evol Comput 9:653–668


  38. Stanley KO, Miikkulainen R (2004) Evolving a roving eye for go. In: Proceedings of the genetic and evolutionary computation conference (GECCO). Springer, New York, pp 1226–1238

  39. Schlessinger E, Bentley PJ, Lotto RB (2005) Analysing the evolvability of neural network agents through structural mutations. In: Capcarrere M (ed) Proceedings of the European conference on artificial life (ECAL 2005). Springer, Berlin, pp 312–321

  40. Yao X, Liu Y (1996) Towards designing artificial neural networks by evolution. Appl Math Comput 91:83–90


  41. Kohl N, Stanley K, Miikkulainen R, Samples M, Sherony R (2006) Evolving a real-world vehicle warning system. In: Proceedings of the genetic and evolutionary computation conference (GECCO), Seattle, Washington, USA, pp 1681–1688

  42. Gomez F, Miikkulainen R (1998) 2-D pole balancing with recurrent evolutionary networks. In: Proceedings of the international conference on artificial neural networks (ICANN). Skovde, Sweden. Elsevier, New York

  43. Goldberg DE, Richardson J (1987) Genetic algorithms with sharing for multimodal function optimization. In: Proceedings of the second international conference on genetic algorithms. Lawrence Erlbaum Associates, Hillsdale, pp 41–49

  44. Gruau F, Whitley D, Pyeatt L (1996) A comparison between cellular encoding and direct encoding for genetic neural networks. Technical Report NC-TR-96-048, NeuroCOLT

  45. Gomez F, Miikkulainen R (1999) Solving non-Markovian control tasks with neuroevolution. In: Proceedings of the 16th international joint conference on artificial intelligence. Morgan Kaufmann, Denver

  46. Coons KE, Robatmili B, Taylor ME, Maher BA, Burger D, McKinley KS (2008) Feature selection and policy optimization for distributed instruction placement using reinforcement learning. In: Proceedings of the 17th international conference on parallel architectures and compilation techniques (PACT), Toronto, Ontario, Canada

  47. Skalak DB (1994) Prototype and feature selection by sampling and random mutation hill-climbing algorithms. In: Proceedings of the 11th international conference on machine learning. Morgan Kaufmann, New Brunswick, pp 293–301

  48. Mayr C (2003) NEAT Matlab. Available via http://www.cs.utexas.edu/~nn/soft-view.php?SoftID=23. Accessed 4 Sept 2008

  49. Ethembabaoglu A, Whiteson S (2008) Automatic feature selection using FS-NEAT. Technical report IAS-UVA-08-02, Intelligent Autonomous Systems Group, University of Amsterdam


Author information

Correspondence to Maxine Tan.

Appendix 1

1.1 Common parameters for regular NEAT, FS-NEAT and FD-NEAT

The common parameters for regular NEAT, FS-NEAT and FD-NEAT, their definitions and the values used in the RARS experiments are displayed in Table 2. The stagnation parameters, which detect stagnation in species fitness, were disabled in the experiments so that species would not die out (a sketch of the disabled check follows Table 2).

Table 2 Common parameters for FS-NEAT, FD-NEAT and regular NEAT that were employed in the RARS experiments
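For concreteness, the following minimal Python sketch shows the kind of species-stagnation check that was disabled, assuming a typical NEAT-style implementation; the class, attribute names and the max_stagnation threshold are illustrative assumptions, not values from Table 2.

    class Species:
        """Tracks the best fitness of one species across generations."""
        def __init__(self):
            self.best_fitness = float("-inf")
            self.generations_since_improvement = 0

        def update(self, generation_best):
            # Record how long the species has gone without improving.
            if generation_best > self.best_fitness:
                self.best_fitness = generation_best
                self.generations_since_improvement = 0
            else:
                self.generations_since_improvement += 1

    def cull_stagnant(species_list, max_stagnation=15, stagnation_enabled=True):
        # With stagnation disabled, as in the RARS experiments, every species
        # survives regardless of how long its fitness has plateaued.
        if not stagnation_enabled:
            return list(species_list)
        return [s for s in species_list
                if s.generations_since_improvement < max_stagnation]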

1.2 Differing parameters

Two parameters differ between FS-NEAT, FD-NEAT and regular NEAT in the RARS experiments: (1) the probability that a connection gene inherited disabled is re-enabled in the offspring (Gene Re-enable Prob.), and (2) the probability that a connection gene inherited enabled is disabled in the offspring (Gene Disable Prob.). Gene Disable Prob. acts only on the input connections and applies only to FD-NEAT, whereas Gene Re-enable Prob. applies to all connection genes, not just the input connections (see the sketch after Table 3). The values of each parameter for FS-NEAT, FD-NEAT and regular NEAT are displayed in Table 3. The two FD-NEAT values were determined empirically: experiments showed that FD-NEAT performed best with these settings.

Table 3 Differing parameters for FS-NEAT, FD-NEAT and regular NEAT used in the RARS experiments
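The following Python sketch illustrates how the two probabilities could act when an offspring inherits its connection genes; the function and class names, and the single-pass structure, are assumptions for illustration, and the actual probability values are those reported in Table 3.

    import random
    from dataclasses import dataclass

    @dataclass
    class Gene:
        enabled: bool
        is_input_conn: bool  # True if the connection leaves an input node

    def apply_inheritance_flips(genes, reenable_prob, disable_prob, fd_neat=False):
        for g in genes:
            if not g.enabled and random.random() < reenable_prob:
                # Gene Re-enable Prob.: any connection gene inherited disabled
                # may be re-enabled, not just input connections.
                g.enabled = True
            elif fd_neat and g.is_input_conn and g.enabled \
                    and random.random() < disable_prob:
                # Gene Disable Prob.: FD-NEAT only, and only for input
                # connections, so features can be dropped over generations.
                g.enabled = False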


Cite this article

Tan, M., Hartley, M., Bister, M. et al. Automated feature selection in neuroevolution. Evol. Intel. 1, 271–292 (2009). https://doi.org/10.1007/s12065-009-0018-z

