Abstract
We propose and analyze new fast feature weighting algorithms based on different types of feature ranking. Feature weighting can be much faster than feature selection because there is no need to find a cut-off threshold in the ranking. The presented weighting schemes can be combined with several distance-based classifiers such as SVM, kNN, or RBF networks (and others). The results show that such methods can be used successfully with these classifiers.
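The idea described above can be sketched in code: derive a per-feature weight from a ranking score, then use the weights inside a distance-based classifier. This is a minimal illustrative sketch, not the paper's actual algorithms; the ranking score used here (absolute difference of per-class feature means, binary labels assumed) is a hypothetical stand-in for ranking criteria such as mutual information, and the classifier is a plain weighted-Euclidean kNN.

```python
import math

def feature_weights(X, y):
    """Ranking-derived weights. The score here (absolute difference of
    per-class feature means, assuming exactly two classes) is only an
    illustrative stand-in for a real ranking such as mutual information.
    Weights are normalized to sum to 1 -- no cut-off threshold is needed."""
    classes = sorted(set(y))
    n_feat = len(X[0])
    means = {c: [sum(x[j] for x, t in zip(X, y) if t == c) /
                 sum(1 for t in y if t == c) for j in range(n_feat)]
             for c in classes}
    scores = [abs(means[classes[0]][j] - means[classes[1]][j])
              for j in range(n_feat)]
    total = sum(scores) or 1.0
    return [s / total for s in scores]

def weighted_knn_predict(X, y, w, query, k=1):
    """kNN with a feature-weighted Euclidean distance: each squared
    coordinate difference is scaled by that feature's weight."""
    dists = [(math.sqrt(sum(wj * (a - b) ** 2
                            for wj, a, b in zip(w, x, query))), t)
             for x, t in zip(X, y)]
    dists.sort(key=lambda p: p[0])
    votes = [t for _, t in dists[:k]]
    return max(set(votes), key=votes.count)

# Toy data: feature 0 separates the classes, feature 1 is noise.
X = [[0.0, 5.0], [0.1, 1.0], [1.0, 5.1], [0.9, 0.9]]
y = [0, 0, 1, 1]
w = feature_weights(X, y)
print(weighted_knn_predict(X, y, w, [0.95, 4.0]))  # prints 1
```

Because the noisy feature receives (near-)zero weight, the query is classified by its position along the informative feature alone, which is the effect the weighting schemes aim for.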
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Jankowski, N., Usowicz, K. (2011). Analysis of Feature Weighting Methods Based on Feature Ranking Methods for Classification. In: Lu, BL., Zhang, L., Kwok, J. (eds) Neural Information Processing. ICONIP 2011. Lecture Notes in Computer Science, vol 7063. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-24958-7_28
DOI: https://doi.org/10.1007/978-3-642-24958-7_28
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-24957-0
Online ISBN: 978-3-642-24958-7
eBook Packages: Computer Science (R0)