Abstract
Currently, most feature weight estimation methods are independent of the classification algorithm, and the combination of discriminant analysis and classifiers for effective pattern classification remains heuristic. The present study addresses the learning of feature weights by using a recently reported classification algorithm, K-Local Hyperplane Distance Nearest Neighbor (HKNN) [18], in which the data are modeled as embedded in a local linear hyperplane. Motivated by the encouraging performance of Learning Discriminative Projections and Prototypes, the feature weights are estimated by minimizing the leave-one-out cross-validation error of the HKNN classifier. An approximate explicit solution is obtained for the feature weight estimation, so that the feature weighting and the classification are perfectly matched. The performance of the combined model is extensively assessed via experiments on both synthetic and benchmark datasets. The superior results demonstrate that the method is competitive with some state-of-the-art models.
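To make the role of HKNN concrete, the following is a minimal sketch of the base classifier from Vincent and Bengio [18], without the feature weighting proposed in this paper: for each class, a test point is projected onto the local hyperplane spanned by its K nearest same-class neighbors, and the class with the smallest point-to-hyperplane distance wins. The function name `hknn_predict` and the ridge parameter `lam` are illustrative choices, not from the source.

```python
import numpy as np

def hknn_predict(X_train, y_train, x, K=3, lam=1e-3):
    """Classify x by its distance to the K-local hyperplane of each class
    (plain HKNN sketch; no learned feature weights)."""
    dists = {}
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        # K nearest neighbors of x within class c
        idx = np.argsort(np.linalg.norm(Xc - x, axis=1))[:K]
        N = Xc[idx]                      # (K, d) neighbor matrix
        mu = N.mean(axis=0)              # reference point on the hyperplane
        V = (N - mu).T                   # (d, K) spanning directions
        # ridge-regularized least squares:
        # alpha = argmin_a ||x - mu - V a||^2 + lam ||a||^2
        A = V.T @ V + lam * np.eye(K)
        alpha = np.linalg.solve(A, V.T @ (x - mu))
        proj = mu + V @ alpha            # projection onto the local hyperplane
        dists[c] = np.linalg.norm(x - proj)
    return min(dists, key=dists.get)     # class with smallest hyperplane distance
```

The small ridge term `lam` keeps the normal equations well conditioned when the neighbors are nearly collinear; the paper's contribution is to replace the plain Euclidean metric above with learned per-feature weights.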
References
Asuncion, A., Newman, D.: UCI machine learning repository (2007), http://www.ics.uci.edu/~mlearn/MLRepository.html
Baudat, G., Anouar, F.: Generalized discriminant analysis using a kernel approach. Neural Comput. 12(10), 2385–2404 (2000)
Cai, D., He, X., Zhou, K., Han, J., Bao, H.: Locality sensitive discriminant analysis. In: International Joint Conference on Artificial Intelligence, IJCAI 2007 (2007)
Chen, H.T., Chang, H.W., Liu, T.L.: Local discriminant embedding and its variants. In: Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2005), vol. 2, pp. 846–853. IEEE Computer Society, Washington, DC (2005)
Fukunaga, K.: Introduction to statistical pattern recognition, 2nd edn. Academic Press Professional, Inc., San Diego (1990)
He, X., Yan, S., Hu, Y., Niyogi, P., Zhang, H.J.: Face recognition using laplacianfaces. IEEE Trans. Pattern Anal. Mach. Intell. 27(3), 328–340 (2005)
Kim, T.K., Kittler, J.: Locally linear discriminant analysis for multimodally distributed classes for face recognition with a single model image. IEEE Trans. Pattern Anal. Mach. Intell. 27(3), 318–327 (2005)
Mika, S., Rätsch, G., Weston, J., Schölkopf, B., Müller, K.R.: Fisher discriminant analysis with kernels. In: Proceedings of the 1999 IEEE Signal Processing Society Workshop on Neural Networks for Signal Processing IX, pp. 41–48 (August 1999)
Sugiyama, M.: Dimensionality reduction of multimodal labeled data by local fisher discriminant analysis. Journal of Machine Learning Research 8, 1027–1061 (2007)
Sugiyama, M., Idé, T., Nakajima, S., Sese, J.: Semi-supervised local fisher discriminant analysis for dimensionality reduction. Machine Learning 78(1-2), 35–61 (2010)
Sun, Y.: Iterative relief for feature weighting: Algorithms, theories, and applications. IEEE Trans. Pattern Anal. Mach. Intell. 29(6), 1035–1051 (2007)
Sun, Y., Todorovic, S., Goodison, S.: Local-learning-based feature selection for high-dimensional data analysis. IEEE Trans. Pattern Anal. Mach. Intell. 32(9), 1610–1626 (2010)
Sun, Y., Wu, D.: A relief based feature extraction algorithm. In: SDM, pp. 188–195 (2008)
Tao, Y., Vojislav, K.: Adaptive local hyperplane classification. Neurocomputing 71(13-15), 3001–3004 (2008)
Villegas, M., Paredes, R.: Simultaneous learning of a discriminative projection and prototypes for nearest-neighbor classification. In: CVPR (2008)
Villegas, M., Paredes, R.: Dimensionality reduction by minimizing nearest-neighbor classification error. Pattern Recognition Letters 32(4), 633–639 (2011)
Vincent, P., Bengio, Y.: K-local hyperplane and convex distance nearest neighbor algorithms. In: Advances in Neural Information Processing Systems, pp. 985–992. The MIT Press (2001)
Wu, M., Schölkopf, B.: A local learning approach for clustering. In: NIPS, pp. 1529–1536 (2006)
Yan, S., Xu, D., Zhang, B., Zhang, H.J., Yang, Q., Lin, S.: Graph embedding and extensions: A general framework for dimensionality reduction. IEEE Trans. Pattern Anal. Mach. Intell. 29(1), 40–51 (2007)
Yang, J., Frangi, A.F., Yang, J.Y., Zhang, D., Jin, Z.: KPCA plus LDA: A complete kernel fisher discriminant framework for feature extraction and recognition. IEEE Trans. Pattern Anal. Mach. Intell. 27(2), 230–244 (2005)
Yang, J., Zhang, L., Yang, J.Y., Zhang, D.: From classifiers to discriminators: A nearest neighbor rule induced discriminant analysis. Pattern Recognition 44(7), 1387–1402 (2011)
Zeng, H., Cheung, Y.M.: Feature selection and kernel learning for local learning-based clustering. IEEE Trans. Pattern Anal. Mach. Intell. 33(8), 1532–1547 (2011)
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Cai, H., Ng, M. (2012). Optimal Combination of Feature Weight Learning and Classification Based on Local Approximation. In: Xiang, Y., Pathan, M., Tao, X., Wang, H. (eds) Data and Knowledge Engineering. ICDKE 2012. Lecture Notes in Computer Science, vol 7696. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-34679-8_9
DOI: https://doi.org/10.1007/978-3-642-34679-8_9
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-34678-1
Online ISBN: 978-3-642-34679-8
eBook Packages: Computer Science (R0)