Abstract
In this paper, a new feature transformation method is introduced to reduce the misclassification rate. Linear classifiers generally struggle to separate feature vectors that lie in a high-dimensional feature space, and when the feature vectors from different classes have underlying distributions that overlap severely, classifying them with acceptable accuracy becomes even more difficult. In such cases, data reduction or feature transformation typically seeks a feature subspace in which the feature vectors are well separated; however, it still cannot overcome the misclassifications that arise from the overlapping region. The proposed feature transformation instead increases the dimension of each feature vector by combining it with other feature vectors from the same class, and then applies a conventional data reduction step. The experimental results show that this sequential process yields significantly improved separability for linear classifiers.
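The abstract does not spell out the combination rule, so the sketch below illustrates only one possible reading of the two-step idea: raise the dimension by concatenating each training vector with a randomly chosen vector from the same class, then reduce with Fisher's linear discriminant before fitting an ordinary linear classifier. The function augment_with_same_class, the concatenation scheme, and the toy Gaussian data are illustrative assumptions, not the authors' actual method.

```python
# Minimal sketch of the dimension-increase-then-reduce idea, under the
# ASSUMPTION that "combining" means concatenating each training vector
# with a randomly chosen same-class vector (the paper's rule may differ).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression

def augment_with_same_class(X, y, seed=0):
    """Double the feature dimension by pairing every sample with a
    randomly permuted partner drawn from the same class (hypothetical)."""
    rng = np.random.default_rng(seed)
    X_aug = np.empty((X.shape[0], 2 * X.shape[1]))
    for label in np.unique(y):
        idx = np.flatnonzero(y == label)
        partners = rng.permutation(idx)            # same-class partner per sample
        X_aug[idx] = np.hstack([X[idx], X[partners]])
    return X_aug

# Toy data: two heavily overlapping Gaussian classes in 5 dimensions.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (200, 5)),
               rng.normal(0.7, 1.0, (200, 5))])
y = np.repeat([0, 1], 200)

# Step 1: dimension increase; Step 2: conventional reduction (Fisher LDA);
# Step 3: a plain linear classifier on the reduced features.
X_aug = augment_with_same_class(X, y, seed=0)
Z = LinearDiscriminantAnalysis(n_components=1).fit_transform(X_aug, y)
clf = LogisticRegression().fit(Z, y)
print("training accuracy:", clf.score(Z, y))
```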
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Bak, E. (2005). A New Multidimensional Feature Transformation for Linear Classifiers and Its Applications. In: Perner, P., Imiya, A. (eds) Machine Learning and Data Mining in Pattern Recognition. MLDM 2005. Lecture Notes in Computer Science, vol 3587. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11510888_27
DOI: https://doi.org/10.1007/11510888_27
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-26923-6
Online ISBN: 978-3-540-31891-0