DOI: 10.1145/1150402.1150453
Article

Regularized discriminant analysis for high dimensional, low sample size data

Published: 20 August 2006

Abstract

Linear and Quadratic Discriminant Analysis have been widely used in many areas of data mining, machine learning, and bioinformatics. Friedman proposed a compromise between the two, called Regularized Discriminant Analysis (RDA), which has been shown to be more flexible in dealing with various class distributions. RDA regularizes the covariance estimates through two regularization parameters, which are chosen to jointly maximize classification performance. The optimal pair of parameters is commonly estimated via cross-validation from a set of candidate pairs. This is computationally prohibitive for high dimensional data, especially when the candidate set is large, which has limited the application of RDA to low dimensional data.

In this paper, a novel algorithm for RDA is presented for high dimensional data. It can efficiently estimate the optimal regularization parameters from a large set of candidates. Experiments on a variety of datasets confirm the claimed theoretical estimate of the efficiency, and also show that, for a properly chosen pair of regularization parameters, RDA performs favorably in classification in comparison with other existing classification methods.
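As background for the setup above, the following is a minimal Python sketch of classical Friedman-style RDA together with the naive cross-validated grid search over the two regularization parameters that the paper sets out to accelerate. It is an illustration under simplifying assumptions, not the paper's algorithm: the shrinkage is written as a plain convex combination of the class covariance, the pooled covariance, and a scaled identity, and all names (rda_fit, rda_predict, rda_select, lam, gam) are invented for this sketch.

    # Minimal RDA sketch (illustrative, not the paper's efficient algorithm).
    import numpy as np

    def rda_fit(X, y, lam, gam):
        """Estimate class means, priors, and regularized covariances."""
        classes = np.unique(y)
        n, p = X.shape
        # Pooled within-class covariance (biased, weighted by class size).
        pooled = sum(np.cov(X[y == c].T, bias=True) * (y == c).sum()
                     for c in classes) / n
        means, covs, priors = {}, {}, {}
        for c in classes:
            Xc = X[y == c]
            Sk = np.cov(Xc.T, bias=True)
            # First shrink the class covariance toward the pooled covariance,
            Sk = (1 - lam) * Sk + lam * pooled
            # then toward a scaled identity; gam > 0 keeps Sk nonsingular
            # even when the sample size is far below the dimensionality.
            Sk = (1 - gam) * Sk + gam * (np.trace(Sk) / p) * np.eye(p)
            means[c], covs[c], priors[c] = Xc.mean(axis=0), Sk, len(Xc) / n
        return classes, means, covs, priors

    def rda_predict(model, X):
        """Assign each row of X to the class minimizing the quadratic score."""
        classes, means, covs, priors = model
        scores = []
        for c in classes:
            diff = X - means[c]
            logdet = np.linalg.slogdet(covs[c])[1]
            # Row-wise Mahalanobis distance under the regularized covariance.
            maha = np.einsum('ij,ij->i', diff @ np.linalg.inv(covs[c]), diff)
            scores.append(maha + logdet - 2.0 * np.log(priors[c]))
        return classes[np.argmin(np.stack(scores), axis=0)]

    def rda_select(X, y, lams, gams, n_folds=5, seed=0):
        """Naive cross-validated grid search over (lam, gam): the baseline
        whose per-candidate refitting cost the paper's algorithm avoids."""
        folds = np.random.default_rng(seed).integers(0, n_folds, size=len(y))
        best, best_err = None, np.inf
        for lam in lams:
            for gam in gams:
                errs = [np.mean(rda_predict(rda_fit(X[folds != f], y[folds != f],
                                                    lam, gam),
                                            X[folds == f]) != y[folds == f])
                        for f in range(n_folds)]
                if np.mean(errs) < best_err:
                    best, best_err = (lam, gam), np.mean(errs)
        return best, best_err

Setting lam = 1, gam = 0 recovers an LDA-like shared covariance and lam = 0, gam = 0 recovers QDA. Note that rda_select refits and inverts a p-by-p covariance for every candidate pair and every fold, roughly O(p^3) per fit; this per-candidate cost is what becomes prohibitive in high dimensions and what the paper's efficient parameter-estimation algorithm is designed to avoid.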



    Published In

    KDD '06: Proceedings of the 12th ACM SIGKDD international conference on Knowledge discovery and data mining
    August 2006
    986 pages
ISBN: 1595933395
DOI: 10.1145/1150402


    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 20 August 2006


    Author Tags

    1. cross-validation
    2. dimensionality reduction
    3. quadratic discriminant analysis
    4. regularization

    Qualifiers

    • Article

    Conference

    KDD06

    Acceptance Rates

    Overall Acceptance Rate 1,133 of 8,635 submissions, 13%



Cited By

• (2024) Artificial Intelligence and Machine Learning for Rice Improvement. Climate-Smart Rice Breeding, pp. 273-300. DOI: 10.1007/978-981-97-7098-4_11. Online publication date: 16-Nov-2024.
• (2023) A Classification Study in High-Dimensional Data of Linear Discriminant Analysis and Regularized Discriminant Analysis. WSEAS Transactions on Mathematics, 22, pp. 315-323. DOI: 10.37394/23206.2023.22.37. Online publication date: 10-May-2023.
• (2022) Applying antagonistic activation pattern to the single-trial classification of mental arithmetic. Heliyon, e11102. DOI: 10.1016/j.heliyon.2022.e11102. Online publication date: Oct-2022.
• (2022) 2D-DOST for seizure identification from brain MRI during pregnancy using KRVFL. Health and Technology, 12(4), pp. 757-764. DOI: 10.1007/s12553-022-00669-4. Online publication date: 21-Apr-2022.
• (2020) Optimization of Text Feature Selection Process Based on Advanced Searching for News Classification. International Journal of Swarm Intelligence Research, 11(4), pp. 1-23. DOI: 10.4018/IJSIR.2020100101. Online publication date: Oct-2020.
• (2020) High-Dimensional Quadratic Discriminant Analysis Under Spiked Covariance Model. IEEE Access, 8, pp. 117313-117323. DOI: 10.1109/ACCESS.2020.3004812. Online publication date: 2020.
• (2018) Deep Max-Margin Discriminant Projection. IEEE Transactions on Cybernetics, pp. 1-13. DOI: 10.1109/TCYB.2018.2831792. Online publication date: 2018.
• (2016) Nonlinear discriminant analysis based on vanishing component analysis. Neurocomputing, 218, pp. 172-184. DOI: 10.1016/j.neucom.2016.08.058. Online publication date: 19-Dec-2016.
• (2015) Sample Weighting: An Inherent Approach for Outlier Suppressing Discriminant Analysis. IEEE Transactions on Knowledge and Data Engineering, 27(11), pp. 3070-3083. DOI: 10.1109/TKDE.2015.2448547. Online publication date: 1-Nov-2015.
• (2015) Max-Margin Discriminant Projection via Data Augmentation. IEEE Transactions on Knowledge and Data Engineering, 27(7), pp. 1964-1976. DOI: 10.1109/TKDE.2015.2397444. Online publication date: 1-Jul-2015.
