Access this chapter
Tax calculation will be finalised at checkout
Purchases are for personal use only
Preview
Unable to display preview. Download preview PDF.
References
G. John, R. Kohavi, and K. Phleger, âIrrelevant features and the feature subset problem,â in Proceedings of the 11th International Conference on Machine Learning, pp. 121-129, Morgan Kaufmann, San Francisco, CA 1994
R. Kohavi and G. John, âWrappers for feature subset selection,â Artificial Intelligence, vol. 97, no. 1-2, pp. 273-324, 1997
E. CantĂș-Paz, âFeature subset selection, class separability, and genetic algorithms,â in Genetic and Evolutionary Computation Conference -GECCO-2004, K. Deb et al., (Eds.), Springer, Berlin Heidelberg New York, 2004
I. Guyon and A. Elisseeff, âAn introduction to variable and feature selection,â Journal of Machine Learning Research, vol. 3, pp. 1157-1182, 2003
A. Jain and D. Zongker, âFeature selection: Evaluation, application and small sample performance,â IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 2, pp. 153-158, 1997
A. Blum and P. Langley, âSelection of relevant features and examples in machine learning,â Artificial Intelligence, vol. 97, no. 1-2, pp. 245-271, 1997
W. Siedlecki and J. Sklansky, âA note on genetic algorithms for largescale feature selection,â Pattern Recognition Letters, vol. 10, pp. 335-347, 1989
F. Z. Brill, D. E. Brown, and W. N. Martin, âGenetic algorithms for feature selection for counterpropagation networks,â Tech. Rep. No. IPC-TR-90-004, University of Virginia, Institute of Parallel Computation, Charlottesville, 1990
T. W. Brotherton and P. K. Simpson, âDynamic feature set training of neural nets for classification,â in Evolutionary Programming IV, J. R. McDonnell, R. G. Reynolds, and D. B. Fogel, (Eds.), pp. 83-94, MIT Cambridge, MA, 1995
J. Bala, K. De Jong, J. Huang, H. Vafaie, and H. Wechsler, âUsing learning to facilitate the evolution of features for recognizing visual concepts,â Evolutionary Computation, vol. 4, no. 3, pp. 297-311, 1996
J. D. Kelly and L. Davis, âHybridizing the genetic algorithm and the K nearest neighbors classification algorithm,â in Proceedings of the Fourth International Conference on Genetic Algorithms, R. K. Belew and L. B. Booker, (Eds.), pp. 377-383, Morgan Kaufmann, San Mateo, CA, 1991
W. F. Punch, E. D. Goodman, M. Pei, L. Chia-Shun, P. Hovland, and R. Enbody, âFurther research on feature selection and classification using genetic algorithms,â in Proceedings of the Fifth International Conference on Genetic Algorithms, S. Forrest, (Ed.), pp. 557-564, Morgan Kaufmann, San Mateo, CA, 1993
M. L. Raymer, W. F. Punch, E. D. Goodman, P. C. Sanschagrin, and L. A. Kuhn, âSimultaneous feature scaling and selection using a genetic algorithm,â in Proceedings of the Seventh International Conference on Genetic Algorithms, T. BĂ€ck, (Ed.), pp. 561-567, Morgan Kaufmann, San Francisco, CA, 1997
M. Kudo and K. Sklansky, âComparison of algorithms that select features for pattern classifiers,â Pattern Recognition, vol. 33, no. 1, pp. 25-41, 2000
H. Vafaie and K. A. De Jong, âRobust feature selection algorithms,â in Proceedings of the International Conference on Tools with Artificial Intelligence. pp. 356-364, IEEE Computer Society, USA 1993
I. Inza, P. Larrañaga, R. Etxeberria, and B. Sierra, âFeature subset se-lection by Bayesian networks based optimization,â Artificial Intelligence, vol. 123, no. 1-2, pp. 157-184, 1999
Erick CantĂș-Paz, âFeature subset selection by estimation of distribution algorithms,â in GECCO 2002: Proceedings of the Genetic and Evolution-ary Computation Conference, W. B. Langdon, E. CantĂș-Paz, K. Mathias, R. Roy, D. Davis, R. Poli, K. Balakrishnan, V. Honavar, G. Rudolph, J. Wegener, L. Bull, M. A. Potter, A. C. Schultz, J. F. Miller, E. Burke, and N. Jonoska, (Eds.), pp. 303-310, Morgan Kaufmann, San Francisco, CA, 2002
M. L. Raymer, W. F. Punch, E. D. Goodman, L. A. Kuhn, and A. K. Jain, âDimensionality reduction using genetic algorithms,â IEEE Transactions on Evolutionary Computation, vol. 4, no. 2, pp. 164-171, 2000
I. Inza, P. Larrañaga, and B. Sierra, âFeature subset selection by Bayesian networks: A comparison with genetic and sequential algorithms,â International Journal of Approximate Reasoning, vol. 27, no. 2, pp. 143-164, 2001
I. Inza, P. Larrañaga, and B. Sierra, âFeature subset selection by estimation of distribution algorithms,â in Estimation of Distribution Algorithms: A new tool for Evolutionary Computation, P. Larrañaga and J. A. Lozano, (Eds.), Kluwer Academic, Dordrecht Hingham, MA 2001
M. Ozdemir, M. J. Embrechts, F. Arciniegas, C. M. Breneman, L. Lock-wood, and K. P. Bennett, âFeature selection for in-silico drug design using genetic algorithms and neural networks,â in IEEE Mountain Workshop on Soft Computing in Industrial Applications. pp. 53-57, IEEE, USA 2001
P.L. Lanzi, âFast feature selection with genetic algorithms: A wrapper approach,â in IEEE International Conference on Evolutionary Computation. pp. 537-540, IEEE, USA 1997
I.-S. Oh, J.-S. Lee, and C. Suen,âAnalysis of class separation and combination of class-dependent features for handwritting recognition,â IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 21, no. 10, pp. 1089-1094, 1999
C.L. Blake and C.J. Merz, âUCI repository of machine learning databases,â 1998
J. R. Quinlan, âInduction of decision trees,â Machine Learning, vol. 1, pp. 81-106, 1986
G. Harik, E. CantĂș-Paz, D. E. Goldberg, and B. L. Miller, âThe gamblerâs ruin problem, genetic algorithms, and the sizing of populations,â Evolutionary Computation, vol. 7, no. 3, pp. 231-253, 1999
G. R. Harik, F. G. Lobo, and D. E. Goldberg, âThe compact genetic algorithm,â in Proceedings of 1998 IEEE Iternational Conference on Evolutionary Computation, Institute of Electrical and Electronics Engineers, pp. 523-528, IEEE Service Center, Piscataway, NJ, 1998
S. Baluja, âPopulation-based incremental learning: A method for integrating genetic search based function optimization and competitive learning,â Tech. Rep. No. CMU-CS-94-163, Carnegie Mellon University, Pittsburgh, PA, 1994
H. MĂŒhlenbein, âThe equation for the response to selection and its use for prediction,â Evolutionary Computation, vol. 5, no. 3, pp. 303-346, 1998
Sanmay Das, âFilters, wrappers and a boosting-based hybrid for feature selection,â in Proceedings of the 18th International Conference on Machine Learning, Carla Brodley and Andrea Danyluk, (Eds.), pp. 74-81, Morgan Kaufmann, San Francisco, CA, 2001
Y. Freund and R. E. Schapire, âExperiments with a new boosting algorithm,â in Proceedings of the Thirteenth International Conference on Machine Learning, L. Saitta, (Ed.), pp. 148-156, Morgan Kaufmann, San Mateo, CA, 1996
M. Matsumoto and T. Nishimura,âMersenne twister: A623-dimensionally equidistributed uniform pseudorandom number generator,â ACM Transactions on Modeling and Computer Simulation, vol. 8, no. 1, pp. 3-30, 1998
B. L. Miller and D. E. Goldberg, âGenetic algorithms, selection schemes, and the varying effects of noise,â Evolutionary Computation, vol. 4, no. 2, pp. 113-131, 1996
T. G. Dietterich, âApproximate statistical tests for comparing supervised classification learning algorithms,â Neural Computation, vol. 10, no. 7, pp. 1895-1924, 1998
E. Alpaydin, âCombined 5Ă2 cv F test for comparing supervised classification algorithms,â Neural Computation, vol. 11, pp. 1885-1892, 1999
J. Reunanen, âOverfitting in making comparisons between variable selec-tion methods,â Journal of Machine Learning Research, vol. 3, pp. 1371-1382,2003
C. Ambroise and G. J. McLachlan, âSelection bias in gene extraction on the basis of microarray gene-expression data,â Proceedings of the National Academy of Sciences, vol. 99, no. 10, pp. 6562-6566, 2002
E. Cantu-Paz and C. Kamath, âOn the use of evolutionary algorithms in data mining,â in Data Mining: A Heuristic Approach, H. Abbass, R. Sarker, and C. Newton, (Eds.), pp. 48-71. IDEA Group, Hershey, PA, 2002
Author information
Authors and Affiliations
Editor information
Editors and Affiliations
Rights and permissions
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
About this chapter
Cite this chapter
CantĂș-Paz, E. (2006). Feature Subset Selection with Hybrids of Filters and Evolutionary Algorithms. In: Pelikan, M., Sastry, K., CantĂșPaz, E. (eds) Scalable Optimization via Probabilistic Modeling. Studies in Computational Intelligence, vol 33. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-34954-9_13
Download citation
DOI: https://doi.org/10.1007/978-3-540-34954-9_13
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-34953-2
Online ISBN: 978-3-540-34954-9
eBook Packages: EngineeringEngineering (R0)