Abstract:
In this letter, the theory of random matrices of increasing dimension is used to construct a form of regularized linear discriminant analysis (RLDA) that asymptotically yields the lowest overall risk with respect to the bias of the discriminant in cost-sensitive classification of two multivariate Gaussian distributions. Numerical experiments using both synthetic and real data show that even in finite-sample settings, the proposed classifier can achieve uniformly lower risk than RLDA across regularization parameters and misclassification costs.
Published in: IEEE Signal Processing Letters (Volume 26, Issue 9, September 2019)
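As background to the abstract, the following is a minimal sketch of standard RLDA with a cost-sensitive decision threshold for two Gaussian classes; it is not the paper's asymptotically optimal bias correction, only the baseline classifier the letter improves upon. All function names, the regularization form (pooled covariance plus gamma times the identity), and the synthetic data are illustrative assumptions.

```python
import numpy as np

def rlda_fit(X0, X1, gamma):
    """Fit a regularized LDA discriminant w^T x + b (illustrative sketch).

    Regularization: pooled sample covariance plus gamma * I.
    """
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    n0, n1 = len(X0), len(X1)
    # Pooled (within-class) sample covariance.
    S = ((X0 - mu0).T @ (X0 - mu0) + (X1 - mu1).T @ (X1 - mu1)) / (n0 + n1 - 2)
    S_reg = S + gamma * np.eye(S.shape[0])
    w = np.linalg.solve(S_reg, mu1 - mu0)   # discriminant direction
    b = -0.5 * w @ (mu0 + mu1)              # plug-in bias term
    return w, b

def rlda_predict(X, w, b, c01=1.0, c10=1.0):
    """Assign class 1 when w^T x + b exceeds log(c01 / c10), the standard
    cost-sensitive threshold (c01 = cost of misclassifying a class-0 sample)."""
    return (X @ w + b > np.log(c01 / c10)).astype(int)

# Two synthetic multivariate Gaussian classes (hypothetical parameters).
rng = np.random.default_rng(0)
d = 10
X0 = rng.multivariate_normal(np.zeros(d), np.eye(d), size=200)
X1 = rng.multivariate_normal(np.full(d, 0.8), np.eye(d), size=200)

w, b = rlda_fit(X0, X1, gamma=1.0)
err0 = rlda_predict(X0, w, b).mean()        # class-0 misclassification rate
err1 = 1 - rlda_predict(X1, w, b).mean()    # class-1 misclassification rate
print(err0, err1)
```

Increasing `c01` shifts the threshold upward, trading class-1 errors for fewer class-0 errors; the letter's contribution is choosing the bias term so that the overall (cost-weighted) risk is asymptotically minimized.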