Abstract
Feature weighting is an important task in data analysis, clustering, and classification. Traditional algorithms learn a single weight vector for the whole dataset, which makes them sensitive to the distribution of the data. In contrast, this paper proposes a novel feature weighting algorithm called local feature weighting (LFW) that assigns each sample its own weight vector. We use the clustering assumption to construct the optimization task. Instead of considering aggregate intra-class and between-class features, we focus on the clustering performance around each training sample: the optimization goals are to minimize the total distance from a training sample to samples of the same class and to maximize the total distance to samples of other classes. A data weight is added to the objective function to emphasize nearby samples, and an iterative process is used to solve the resulting problem. Experiments show that the new algorithm performs well on data classification. In addition, we provide a simplified version of LFW that runs faster with little loss of accuracy.
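The abstract's per-sample objective (pull same-class neighbours closer, push other-class samples away, one weight vector per sample) can be sketched as follows. This is a rough illustrative heuristic only, not the paper's actual iterative optimization: the function name `local_feature_weights`, the squared-distance statistics, and the ratio-based weighting are all our assumptions.

```python
import numpy as np

def local_feature_weights(X, y, eps=1e-8):
    """Illustrative sketch of per-sample feature weighting.

    For each sample, a feature gets high weight when it separates the
    sample from other classes (large between-class distance) while
    keeping it close to its own class (small within-class distance).
    This ratio heuristic stands in for the paper's iterative solver.
    """
    n, d = X.shape
    W = np.zeros((n, d))
    for i in range(n):
        diff = (X - X[i]) ** 2            # squared per-feature distances
        same = (y == y[i])
        same[i] = False                   # exclude the sample itself
        other = (y != y[i])
        d_within = diff[same].mean(axis=0)    # intra-class spread per feature
        d_between = diff[other].mean(axis=0)  # between-class spread per feature
        w = d_between / (d_within + eps)
        W[i] = w / w.sum()                # normalize to a weight vector
    return W
```

On a toy dataset where only the first feature separates the classes, each sample's weight vector concentrates on that feature, illustrating the "unique weight vector per sample" idea from the abstract.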
The paper starting on page 293 of this volume has been retracted because a significant portion of the work was copied from the paper “Local Feature Selection for Data Classification” by Narges Armanfard, James P. Reilly, and Majid Komeili, published in 2016 in the IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, VOL. 38. The erratum to this chapter is available at 10.1007/978-3-662-54395-5_26
This work was co-supported by the natural science fund project of Xinjiang (2014211A046), the national science and technology project (grant no. 2014BAH13F02), and the National Science and Technology Support Program project "Research and Demonstration of a Virtual Exhibition System for the Spread of Special Culture" (project no. 2015BAK04B05).
References
Armanfard, N., Reilly, J.P., Komeili, M.: Local feature selection for data classification. IEEE Trans. Pattern Anal. Mach. Intell. 38(6), 1217–1227 (2016)
Peng, H., Long, F., Ding, C.: Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy. IEEE Trans. Pattern Anal. Mach. Intell. 27(8), 1226–1238 (2005)
Tahir, M.A., Bouridane, A., Kurugollu, F.: Simultaneous feature selection and feature weighting using Hybrid Tabu Search/K-nearest neighbor classifier. Pattern Recogn. Lett. 28(4), 438–446 (2007)
Huang, J.Z., Ng, M.K., Rong, H., Li, Z.: Automated variable weighting in k-means type clustering. IEEE Trans. Pattern Anal. Mach. Intell. 27(5), 657–668 (2005)
Saha, A., Das, S.: Categorical fuzzy k-modes clustering with automated feature weight learning. Neurocomputing 166, 422–435 (2015)
Wang, L.: Feature selection with kernel class separability. IEEE Trans. Pattern Anal. Mach. Intell. 30(9), 1534–1546 (2008)
Lughofer, E.: On-line incremental feature weighting in evolving fuzzy classifiers. Fuzzy Sets Syst. 163(1), 1–23 (2011)
Roweis, S.T., Saul, L.K.: Nonlinear dimensionality reduction by locally linear embedding. Science 290(5500), 2323–2326 (2000)
Tenenbaum, J.B., De Silva, V., Langford, J.C.: A global geometric framework for nonlinear dimensionality reduction. Science 290(5500), 2319–2323 (2000)
Belkin, M., Niyogi, P.: Laplacian eigenmaps for dimensionality reduction and data representation. Neural Comput. 15(6), 1373–1396 (2003)
Sugiyama, M.: Local fisher discriminant analysis for supervised dimensionality reduction. In: Proceedings of the 23rd International Conference on Machine Learning, pp. 905–912. ACM, June 2006
Sun, Y.: Iterative RELIEF for feature weighting: algorithms, theories, and applications. IEEE Trans. Pattern Anal. Mach. Intell. 29(6), 1035–1051 (2007)
Chen, B., Liu, H., Chai, J., Bao, Z.: Large margin feature weighting method via linear programming. IEEE Trans. Knowl. Data Eng. 21(10), 1475–1488 (2009)
Gilad-Bachrach, R., Navot, A., Tishby, N.: Margin based feature selection-theory and algorithms. In: Proceedings of the Twenty-First International Conference on Machine Learning, p. 43. ACM, July 2004
Chai, J., Chen, H., Huang, L., Shang, F.: Maximum margin multiple-instance feature weighting. Pattern Recogn. 47(6), 2091–2103 (2014)
Lichman, M.: UCI Machine Learning Repository. University of California, School of Information and Computer Science, Irvine, CA (2013). http://archive.ics.uci.edu/ml
Sun, Y., Todorovic, S., Goodison, S.: Local-learning-based feature selection for high-dimensional data analysis. IEEE Trans. Pattern Anal. Mach. Intell. 32(9), 1610–1626 (2010)
Copyright information
© 2017 Springer-Verlag GmbH Germany
Cite this chapter
Jia, G., Zhao, H., Pan, Z., Wang, L. (2017). RETRACTED CHAPTER: Local Feature Weighting for Data Classification. In: Pan, Z., Cheok, A., Müller, W., Zhang, M. (eds) Transactions on Edutainment XIII. Lecture Notes in Computer Science, vol. 10092. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-54395-5_25
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-662-54394-8
Online ISBN: 978-3-662-54395-5
eBook Packages: Computer Science (R0)