Linear discriminant analysis with an information divergence criterion | IEEE Conference Publication | IEEE Xplore



Abstract:

Linear discriminant analysis seeks a one-dimensional projection of a dataset to alleviate the problems associated with classifying high-dimensional data. The earliest methods, based on second-order statistics, often fail on multimodal datasets. Information-theoretic criteria do not suffer in such cases, and they allow for projections onto more than one dimension and for more than two classes. These approaches are based on maximizing the mutual information between the projected data and the class labels. However, mutual information is computationally demanding and vulnerable to class imbalance. In this paper we propose an information-theoretic criterion for learning discriminants based on the Euclidean distance divergence between classes. This objective more directly seeks projections that separate the classes, and it performs well in the presence of class imbalance. We demonstrate its effectiveness on real datasets and provide extensions to the multi-class and multi-dimensional cases.
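To make the criterion concrete: in the information-theoretic learning literature, the Euclidean distance divergence between two densities p and q is the integrated squared difference, D(p, q) = ∫(p − q)² = ∫p² + ∫q² − 2∫pq, which has a closed-form plug-in estimate under Parzen (Gaussian kernel) density estimates, since the inner product of two Gaussian kernels is again a Gaussian with bandwidth √2·σ. The sketch below is not the authors' implementation; it assumes a fixed kernel bandwidth and, instead of gradient ascent, simply scans candidate 1-D projection directions of synthetic 2-D data with a bimodal class, the setting where second-order LDA struggles.

```python
import numpy as np

def gauss(d, sigma):
    # 1-D Gaussian kernel evaluated on an array of pairwise differences d
    return np.exp(-d**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

def euclidean_divergence(a, b, sigma=0.5):
    # Plug-in estimate of D(p, q) = int (p - q)^2 for 1-D samples a ~ p, b ~ q,
    # using Parzen estimates; each pairwise term uses bandwidth sqrt(2)*sigma
    # (the Gaussian convolution identity for inner products of kernel sums).
    s = np.sqrt(2) * sigma
    paa = gauss(a[:, None] - a[None, :], s).mean()   # estimate of int p^2
    pbb = gauss(b[:, None] - b[None, :], s).mean()   # estimate of int q^2
    pab = gauss(a[:, None] - b[None, :], s).mean()   # estimate of int p*q
    return paa + pbb - 2 * pab

rng = np.random.default_rng(0)
# Class 0 is bimodal along x (two well-separated clusters);
# class 1 is unimodal, offset along y.
X0 = np.vstack([rng.normal([-2.0, 0.0], 0.3, (50, 2)),
                rng.normal([ 2.0, 0.0], 0.3, (50, 2))])
X1 = rng.normal([0.0, 2.0], 0.3, (100, 2))

# Scan unit directions w and keep the one maximizing the divergence
# between the two projected class samples.
best_w, best_d = None, -np.inf
for theta in np.linspace(0.0, np.pi, 180, endpoint=False):
    w = np.array([np.cos(theta), np.sin(theta)])
    d = euclidean_divergence(X0 @ w, X1 @ w)
    if d > best_d:
        best_w, best_d = w, d
```

On this toy data the maximizing direction is close to the y-axis: projecting onto y keeps both classes unimodal and well separated, whereas projecting onto x splits class 0's mass between two modes and yields a smaller divergence. A gradient-based search over w (as a learning method would use) replaces the angle scan in higher dimensions.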
Date of Conference: 12-17 July 2015
Date Added to IEEE Xplore: 01 October 2015

Conference Location: Killarney, Ireland

