Optimal Combination of Feature Weight Learning and Classification Based on Local Approximation

  • Conference paper
Data and Knowledge Engineering (ICDKE 2012)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 7696)


Abstract

Currently, most feature weight estimation methods are independent of the classification algorithm, and the combination of discriminant analysis and classifiers for effective pattern classification remains heuristic. The present study addresses the problem of learning feature weights using a recently reported classification algorithm, the K-Local Hyperplane Distance Nearest Neighbor (HKNN) [17], in which the data are modeled as embedded in local linear hyperplanes. Motivated by the encouraging performance of Learning Discriminative Projections and Prototypes, the feature weights are estimated by minimizing the leave-one-out cross-validation error of the HKNN classifier, and an approximate explicit solution is obtained for the weight estimates. The feature weighting is therefore matched exactly to the classifier. The performance of the combined model is extensively assessed in experiments on both synthetic and benchmark datasets, and the results show that the method is competitive with several state-of-the-art models.
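
To make the setting concrete, below is a minimal NumPy sketch of a penalized HKNN decision rule [17] with an optional per-feature weight vector, together with the leave-one-out objective the paper minimizes over those weights. The function names, the ridge penalty lam, and the application of the weights as a diagonal metric are illustrative assumptions rather than the authors' implementation; in particular, the paper replaces the brute-force leave-one-out loop below with an approximate explicit solution.

```python
import numpy as np

def hyperplane_distance(x, neighbors, weights=None, lam=1e-3):
    """Distance from a point x (shape (d,)) to the local affine hyperplane
    spanned by its K same-class neighbors (shape (K, d)), optionally under a
    per-feature weighting -- the quantity the paper proposes to learn."""
    w = np.sqrt(weights) if weights is not None else 1.0
    mu = neighbors.mean(axis=0)          # centroid of the local patch
    V = (neighbors - mu) * w             # directions spanning the hyperplane
    r = (x - mu) * w                     # residual to be explained
    # Ridge-regularized least squares: min_a ||r - V^T a||^2 + lam ||a||^2,
    # the penalized variant of the local hyperplane projection.
    A = V @ V.T + lam * np.eye(V.shape[0])
    a = np.linalg.solve(A, V @ r)
    return np.linalg.norm(r - V.T @ a)

def hknn_predict(x, X_train, y_train, K=5, weights=None, lam=1e-3):
    """Assign x to the class whose local K-neighbor hyperplane is closest.
    Assumes every class has at least K training samples."""
    w = np.sqrt(weights) if weights is not None else 1.0
    dist = {}
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        # K nearest same-class neighbors under the weighted Euclidean metric
        idx = np.argsort(np.linalg.norm((Xc - x) * w, axis=1))[:K]
        dist[c] = hyperplane_distance(x, Xc[idx], weights, lam)
    return min(dist, key=dist.get)

def loo_error(X, y, weights, K=5, lam=1e-3):
    """Leave-one-out error of HKNN for a given feature weighting: the
    objective minimized in the paper (there via an approximate explicit
    solution; here evaluated by brute force for illustration)."""
    idx = np.arange(len(X))
    wrong = sum(hknn_predict(X[i], X[idx != i], y[idx != i], K, weights, lam) != y[i]
                for i in idx)
    return wrong / len(X)
```

With uniform weights this reduces to plain penalized HKNN; learning the weights amounts to choosing the diagonal metric under which the local hyperplanes separate the classes with the smallest leave-one-out error.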


References

  1. Asuncion, A., Newman, D.: UCI machine learning repository (2007), http://www.ics.uci.edu/~mlearn/MLRepository.html

  2. Baudat, G., Anouar, F.: Generalized discriminant analysis using a kernel approach. Neural Comput. 12(10), 2385–2404 (2000)

  3. Cai, D., He, X., Zhou, K., Han, J., Bao, H.: Locality sensitive discriminant analysis. In: International Joint Conference on Artificial Intelligence (IJCAI 2007) (2007)

  4. Chen, H.T., Chang, H.W., Liu, T.L.: Local discriminant embedding and its variants. In: Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2005), vol. 2, pp. 846–853. IEEE Computer Society, Washington, DC (2005)

  5. Fukunaga, K.: Introduction to Statistical Pattern Recognition, 2nd edn. Academic Press Professional, Inc., San Diego (1990)

  6. He, X., Yan, S., Hu, Y., Niyogi, P., Zhang, H.J.: Face recognition using Laplacianfaces. IEEE Trans. Pattern Anal. Mach. Intell. 27(3), 328–340 (2005)

  7. Kim, T.K., Kittler, J.: Locally linear discriminant analysis for multimodally distributed classes for face recognition with a single model image. IEEE Trans. Pattern Anal. Mach. Intell. 27(3), 318–327 (2005)

  8. Mika, S., Rätsch, G., Weston, J., Schölkopf, B., Müller, K.R.: Fisher discriminant analysis with kernels. In: Proceedings of the 1999 IEEE Signal Processing Society Workshop on Neural Networks for Signal Processing IX, pp. 41–48 (August 1999)

  9. Sugiyama, M.: Dimensionality reduction of multimodal labeled data by local Fisher discriminant analysis. Journal of Machine Learning Research 8, 1027–1061 (2007)

  10. Sugiyama, M., Idé, T., Nakajima, S., Sese, J.: Semi-supervised local Fisher discriminant analysis for dimensionality reduction. Machine Learning 78(1-2), 35–61 (2010)

  11. Sun, Y.: Iterative RELIEF for feature weighting: Algorithms, theories, and applications. IEEE Trans. Pattern Anal. Mach. Intell. 29(6), 1035–1051 (2007)

  12. Sun, Y., Todorovic, S., Goodison, S.: Local-learning-based feature selection for high-dimensional data analysis. IEEE Trans. Pattern Anal. Mach. Intell. 32(9), 1610–1626 (2010)

  13. Sun, Y., Wu, D.: A RELIEF based feature extraction algorithm. In: SDM, pp. 188–195 (2008)

  14. Yang, T., Kecman, V.: Adaptive local hyperplane classification. Neurocomputing 71(13-15), 3001–3004 (2008)

  15. Villegas, M., Paredes, R.: Simultaneous learning of a discriminative projection and prototypes for nearest-neighbor classification. In: CVPR (2008)

  16. Villegas, M., Paredes, R.: Dimensionality reduction by minimizing nearest-neighbor classification error. Pattern Recognition Letters 32(4), 633–639 (2011)

  17. Vincent, P., Bengio, Y.: K-local hyperplane and convex distance nearest neighbor algorithms. In: Advances in Neural Information Processing Systems, pp. 985–992. The MIT Press (2001)

  18. Wu, M., Schölkopf, B.: A local learning approach for clustering. In: NIPS, pp. 1529–1536 (2006)

  19. Yan, S., Xu, D., Zhang, B., Zhang, H.J., Yang, Q., Lin, S.: Graph embedding and extensions: A general framework for dimensionality reduction. IEEE Trans. Pattern Anal. Mach. Intell. 29(1), 40–51 (2007)

  20. Yang, J., Frangi, A.F., Yang, J.Y., Zhang, D., Jin, Z.: KPCA plus LDA: A complete kernel Fisher discriminant framework for feature extraction and recognition. IEEE Trans. Pattern Anal. Mach. Intell. 27(2), 230–244 (2005)

  21. Yang, J., Zhang, L., Yang, J.Y., Zhang, D.: From classifiers to discriminators: A nearest neighbor rule induced discriminant analysis. Pattern Recognition 44(7), 1387–1402 (2011)

  22. Zeng, H., Cheung, Y.M.: Feature selection and kernel learning for local learning-based clustering. IEEE Trans. Pattern Anal. Mach. Intell. 33(8), 1532–1547 (2011)


Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Cai, H., Ng, M. (2012). Optimal Combination of Feature Weight Learning and Classification Based on Local Approximation. In: Xiang, Y., Pathan, M., Tao, X., Wang, H. (eds) Data and Knowledge Engineering. ICDKE 2012. Lecture Notes in Computer Science, vol 7696. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-34679-8_9

  • DOI: https://doi.org/10.1007/978-3-642-34679-8_9

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-34678-1

  • Online ISBN: 978-3-642-34679-8

