DOI: 10.1145/2393216.2393219
research-article

Harmony-based feature selection to improve the nearest neighbor classification

Published: 26 October 2012

Abstract

This paper presents a new approach to feature selection. The proposed approach uses Harmony Search with a novel fitness function to eliminate noisy and irrelevant features. Each harmony vector holds real-valued weights over the feature space, and the most significant features are selected by thresholding these weights. The fitness function of the Harmony Search is the Area Under the receiver operating characteristic Curve (AUC). The selected features are then used to improve the classification of the k-Nearest Neighbor (k-NN) classifier. Experimental results show that the proposed method improves the classification performance of the k-NN algorithm in comparison with other established feature selection methods such as BAHSIC, FSS, BSS and MFS.
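The scheme described in the abstract — real-valued harmony vectors thresholded into feature subsets, scored by the AUC of a nearest-neighbor classifier — can be sketched as follows. This is a minimal, hypothetical illustration, not the authors' implementation: the harmony-search parameters (memory size, HMCR, PAR, pitch-adjustment width), the 0.5 selection threshold, and all function names are assumptions for the sketch, and the fitness uses a simple leave-one-out 1-NN score.

```python
import random

def euclidean(a, b, feats):
    # Distance restricted to the currently selected feature subset.
    return sum((a[f] - b[f]) ** 2 for f in feats) ** 0.5

def loo_1nn_scores(X, y, feats):
    # Leave-one-out 1-NN: score each sample by its nearest neighbor's label.
    scores = []
    for i in range(len(X)):
        j = min((k for k in range(len(X)) if k != i),
                key=lambda k: euclidean(X[i], X[k], feats))
        scores.append(y[j])
    return scores

def auc(y_true, scores):
    # Rank-based AUC (Mann-Whitney): P(score of positive > score of negative),
    # counting ties as one half.
    pos = [s for s, t in zip(scores, y_true) if t == 1]
    neg = [s for s, t in zip(scores, y_true) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def fitness(weights, X, y, threshold=0.5):
    # Threshold the real-valued weight vector into a feature subset,
    # then evaluate it by leave-one-out 1-NN AUC.
    feats = [f for f, w in enumerate(weights) if w > threshold]
    if not feats:
        return 0.0
    return auc(y, loo_1nn_scores(X, y, feats))

def harmony_search(X, y, hms=8, hmcr=0.9, par=0.3, iters=200, seed=0):
    # hms: harmony memory size; hmcr: memory-consideration rate;
    # par: pitch-adjustment rate (illustrative values).
    rng = random.Random(seed)
    d = len(X[0])
    memory = [[rng.random() for _ in range(d)] for _ in range(hms)]
    fits = [fitness(h, X, y) for h in memory]
    for _ in range(iters):
        new = []
        for f in range(d):
            if rng.random() < hmcr:
                w = rng.choice(memory)[f]          # pick from memory
                if rng.random() < par:             # pitch adjustment
                    w = min(1.0, max(0.0, w + rng.uniform(-0.1, 0.1)))
            else:
                w = rng.random()                   # random improvisation
            new.append(w)
        fnew = fitness(new, X, y)
        worst = min(range(hms), key=lambda i: fits[i])
        if fnew > fits[worst]:                     # replace worst harmony
            memory[worst], fits[worst] = new, fnew
    best = max(range(hms), key=lambda i: fits[i])
    return memory[best]
```

After the search, the returned weight vector is thresholded once more to obtain the final feature subset, and an ordinary k-NN classifier is trained on those features alone.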

References

[1] Aha, D. W. and Bankert, R. L. (1994). Feature selection for case-based classification of cloud types: An empirical comparison. AAAI Press, pp. 106--112.
[2] Blake, C. L. and Merz, C. J. UCI repository of machine learning databases. http://www.ics.uci.edu/~mlearn/MLRepository.html, Oct. 25, 2000 [accessed Nov. 10, 2011].
[3] Das, S. (2001). Filters, wrappers and a boosting-based hybrid for feature selection. In Proc. ICML '01, pp. 74--81.
[4] Das, S., Mukhopadhyay, A., Roy, A., Abraham, A. and Panigrahi, B. K. (2011). Exploratory power of the harmony search algorithm: Analysis and improvements for global numerical optimization. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 41(1), pp. 89--106.
[5] Duda, R. O. and Hart, P. E. (1973). Pattern Classification and Scene Analysis. Wiley.
[6] Fawcett, T. (2004). ROC graphs: Notes and practical considerations for researchers. Technical Report HPL-2003-4, HP Laboratories, pp. 1--38.
[7] Frohlich, H., Chapelle, O. and Scholkopf, B. (2003). Feature selection for support vector machines by means of genetic algorithms. In Proc. 15th IEEE Int. Conf. on Tools with Artificial Intelligence, pp. 142--148.
[8] Hall, M. A. (1999). Correlation-based Feature Subset Selection for Machine Learning. Ph.D. thesis, Department of Computer Science, University of Waikato, New Zealand.
[9] Pudil, P., Novovicová, J. and Kittler, J. (1994). Floating search methods in feature selection. Pattern Recognition Letters, 15(11), pp. 1119--1125.
[10] Song, L., Smola, A. J., Gretton, A., Borgwardt, K. M. and Bedo, J. (2007). Supervised feature selection via dependence estimation. In Proc. ICML '07, pp. 823--830.
[11] Bay, S. D. (1998). Combining nearest neighbor classifiers through multiple feature subsets. In Proc. 15th International Conference on Machine Learning, ICML 1998, pp. 37--45.
[12] Lanzi, P. L. (1997). Fast feature selection with genetic algorithms: A filter approach. In Proc. IEEE Int. Conf. on Evolutionary Computation, pp. 537--540.
[13] Liu, H. and Yu, L. (2005). Toward integrating feature selection algorithms for classification and clustering. IEEE Transactions on Knowledge and Data Engineering, 17(4), pp. 491--502.
[14] Mahdavi, M., Fesanghary, M. and Damangir, E. (2007). An improved harmony search algorithm for solving optimization problems. Applied Mathematics and Computation, 188(2), pp. 1567--1579.
[15] Marill, T. and Green, D. (1963). On the effectiveness of receptors in recognition systems. IEEE Transactions on Information Theory, 9(1), pp. 11--17.
[16] Muni, D. P., Pal, N. R. and Das, J. (2006). Genetic programming for simultaneous feature selection and classifier design. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 36(1), pp. 106--117.
[17] Vivencio, D., Hruschka, E., Nicoletti, M., dos Santos, E. and Galvao, S. (2007). Feature-weighted k-nearest neighbor classifier. In Proc. FOCI 2007, pp. 481--486.
[18] Zomorodian, M. J., Adeli, A., Sinaee, M. and Hashemi, S. (2012). Improving nearest neighbor classification by elimination of noisy irrelevant features. In Proc. ACIIDS 2012, Part II, LNAI 7197, pp. 11--21.


    Published In

    CCSEIT '12: Proceedings of the Second International Conference on Computational Science, Engineering and Information Technology
    October 2012
    800 pages
    ISBN:9781450313100
    DOI:10.1145/2393216

    Sponsors

• Avinashilingam University

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. k-NN
    2. AUC
    3. feature selection
    4. harmony search
    5. noisy feature elimination

