RETRACTED CHAPTER: Local Feature Weighting for Data Classification

Transactions on Edutainment XIII

Part of the book series: Lecture Notes in Computer Science (TEDUTAIN, volume 10092)

Abstract

Feature weighting is an important task in data analysis, clustering, and classification. Traditional algorithms learn a single weight vector shared across the whole dataset, which makes them sensitive to the distribution of the data. In contrast, this paper proposes a novel feature weighting algorithm, called local feature weighting (LFW), that assigns each sample its own weight vector. We use the clustering assumption to construct the optimization task. Instead of considering aggregate intra-class and between-class statistics, we focus on the clustering performance around each training sample: the optimization goals are to minimize the total distance from a training sample to the other samples in its class and to maximize its total distance to samples in other classes. A data weight is added to the objective function to emphasize nearby samples, and an iterative process is used to solve the resulting problem. Experiments show that the new algorithm performs well on data classification. In addition, we provide a simplified version of LFW that runs faster with little loss of accuracy.
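The abstract's idea of per-sample weight vectors that favor small proximity-weighted intra-class distances and large inter-class distances can be sketched as follows. This is an illustrative approximation, not the authors' exact formulation: the function name `lfw_weights` and the hyperparameters `gamma` (proximity bandwidth) and `temp` (softmax temperature) are assumptions, and the closed-form softmax stands in for the paper's iterative solver.

```python
import numpy as np

def lfw_weights(X, y, gamma=1.0, temp=1.0):
    """Per-sample feature weights in the spirit of local feature weighting.

    For each training sample, score every feature by its proximity-weighted
    inter-class squared distance minus the intra-class one, then map the
    scores to a non-negative weight vector summing to 1 via a softmax.
    Illustrative sketch only; `gamma` and `temp` are assumed hyperparameters.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    n, d = X.shape
    W = np.empty((n, d))
    for i in range(n):
        diff2 = (X - X[i]) ** 2                     # (n, d) per-feature squared gaps
        prox = np.exp(-gamma * diff2.sum(axis=1))   # emphasize nearby samples
        prox[i] = 0.0                               # exclude the sample itself
        same = (y == y[i])
        intra = prox[same] @ diff2[same]            # per-feature intra-class score
        inter = prox[~same] @ diff2[~same]          # per-feature inter-class score
        score = inter - intra                       # larger => locally discriminative
        w = np.exp((score - score.max()) / temp)    # softmax: w >= 0, sum(w) = 1
        W[i] = w / w.sum()
    return W
```

Each row of the returned matrix is one sample's weight vector, so a downstream classifier (e.g. a weighted nearest-neighbor rule) can measure distances to each training sample under that sample's own metric.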

The paper starting on page 293 of this volume has been retracted because a significant portion of the work was copied from the paper “Local Feature Selection for Data Classification” by Narges Armanfard, James P. Reilly, and Majid Komeili, published in 2016 in IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 38. The erratum to this chapter is available at https://doi.org/10.1007/978-3-662-54395-5_26.

This work was co-supported by the Natural Science Fund Project of Xinjiang (2014211A046), the National Science and Technology project (grant no. 2014BAH13F02), and the National Science and Technology Support Program: Research and Demonstration of a Virtual Exhibition System for the Spread of Special Culture (project no. 2015BAK04B05).

References

  1. Armanfard, N., Reilly, J.P., Komeili, M.: Local feature selection for data classification. IEEE Trans. Pattern Anal. Mach. Intell. 38(6), 1217–1227 (2016)

  2. Peng, H., Long, F., Ding, C.: Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy. IEEE Trans. Pattern Anal. Mach. Intell. 27(8), 1226–1238 (2005)

  3. Tahir, M.A., Bouridane, A., Kurugollu, F.: Simultaneous feature selection and feature weighting using Hybrid Tabu Search/K-nearest neighbor classifier. Pattern Recogn. Lett. 28(4), 438–446 (2007)

  4. Huang, J.Z., Ng, M.K., Rong, H., Li, Z.: Automated variable weighting in k-means type clustering. IEEE Trans. Pattern Anal. Mach. Intell. 27(5), 657–668 (2005)

  5. Saha, A., Das, S.: Categorical fuzzy k-modes clustering with automated feature weight learning. Neurocomputing 166, 422–435 (2015)

  6. Wang, L.: Feature selection with kernel class separability. IEEE Trans. Pattern Anal. Mach. Intell. 30(9), 1534–1546 (2008)

  7. Lughofer, E.: On-line incremental feature weighting in evolving fuzzy classifiers. Fuzzy Sets Syst. 163(1), 1–23 (2011)

  8. Roweis, S.T., Saul, L.K.: Nonlinear dimensionality reduction by locally linear embedding. Science 290(5500), 2323–2326 (2000)

  9. Tenenbaum, J.B., De Silva, V., Langford, J.C.: A global geometric framework for nonlinear dimensionality reduction. Science 290(5500), 2319–2323 (2000)

  10. Belkin, M., Niyogi, P.: Laplacian eigenmaps for dimensionality reduction and data representation. Neural Comput. 15(6), 1373–1396 (2003)

  11. Sugiyama, M.: Local fisher discriminant analysis for supervised dimensionality reduction. In: Proceedings of the 23rd International Conference on Machine Learning, pp. 905–912. ACM, June 2006

  12. Sun, Y.: Iterative RELIEF for feature weighting: algorithms, theories, and applications. IEEE Trans. Pattern Anal. Mach. Intell. 29(6), 1035–1051 (2007)

  13. Chen, B., Liu, H., Chai, J., Bao, Z.: Large margin feature weighting method via linear programming. IEEE Trans. Knowl. Data Eng. 21(10), 1475–1488 (2009)

  14. Gilad-Bachrach, R., Navot, A., Tishby, N.: Margin based feature selection-theory and algorithms. In: Proceedings of the Twenty-First International Conference on Machine Learning, p. 43. ACM, July 2004

  15. Chai, J., Chen, H., Huang, L., Shang, F.: Maximum margin multiple-instance feature weighting. Pattern Recogn. 47(6), 2091–2103 (2014)

  16. Lichman, M.: UCI Machine Learning Repository (2013). http://archive.ics.uci.edu/ml. Irvine, CA: University of California, School of Information and Computer Science

  17. Sun, Y., Todorovic, S., Goodison, S.: Local-learning-based feature selection for high-dimensional data analysis. IEEE Trans. Pattern Anal. Mach. Intell. 32(9), 1610–1626 (2010)

Author information

Corresponding author

Correspondence to Haiying Zhao.

Copyright information

© 2017 Springer-Verlag GmbH Germany

About this chapter

Cite this chapter

Jia, G., Zhao, H., Pan, Z., Wang, L. (2017). RETRACTED CHAPTER: Local Feature Weighting for Data Classification. In: Pan, Z., Cheok, A., Müller, W., Zhang, M. (eds) Transactions on Edutainment XIII. Lecture Notes in Computer Science, vol 10092. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-54395-5_25

  • DOI: https://doi.org/10.1007/978-3-662-54395-5_25

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-662-54394-8

  • Online ISBN: 978-3-662-54395-5

  • eBook Packages: Computer Science (R0)
