
Joint Sparse Locality Preserving Regression for Discriminative Learning


Abstract:

Ridge Regression (RR) is a classical method that is widely used in multiple regression analysis. However, traditional RR does not take the local geometric structure of the data into consideration for discriminative learning, and it is sensitive to outliers because it is based on the L_{2}-norm. To address these problems, this article proposes a novel method called Joint Sparse Locality Preserving Regression (JSLPR) for discriminative learning. JSLPR not only applies the L_{2,1}-norm to both the loss function and the regularization term but also takes the local geometric structure of the data into consideration. The use of the L_{2,1}-norm guarantees robustness to outliers and noise, as well as the joint sparsity needed for effective feature selection. Taking the local geometric structure into consideration improves the performance of feature extraction and selection when the data lie on a manifold. To solve the optimization problem of JSLPR, an iterative algorithm is proposed and its convergence is proven. Experiments on four well-known face databases are conducted, and the results show the merits of the proposed JSLPR for feature extraction and selection.
Page(s): 790 - 801
Date of Publication: 08 June 2023
Electronic ISSN: 2471-285X

