Abstract:
Ridge Regression (RR) is a classical method that is widely used in multiple regression analysis. However, traditional RR does not take the local geometric structure of the data into consideration for discriminative learning, and it is sensitive to outliers because it is based on the L_{2}-norm. To address these problems, this article proposes a novel method called Joint Sparse Locality Preserving Regression (JSLPR) for discriminative learning. JSLPR not only applies the L_{2,1}-norm to both the loss function and the regularization term but also takes the local geometric structure of the data into consideration. The use of the L_{2,1}-norm guarantees robustness to outliers and noise as well as joint sparsity for effective feature selection, while exploiting the local geometric structure improves feature extraction and selection when the data lie on a manifold. To solve the optimization problem of JSLPR, an iterative algorithm is proposed and its convergence is proven. Experiments on four well-known face databases are conducted, and the results demonstrate the merits of JSLPR for feature extraction and selection.
Published in: IEEE Transactions on Emerging Topics in Computational Intelligence ( Volume: 8, Issue: 1, February 2024)
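The exact JSLPR objective is not reproduced in this abstract, but the L_{2,1}-norm it relies on has a standard definition: the sum of the Euclidean norms of a matrix's rows. The minimal sketch below (in NumPy; the function name `l21_norm` is my own, not from the paper) illustrates why penalizing this norm yields the joint sparsity the abstract mentions: an entire row of the projection matrix can shrink to zero, discarding the corresponding feature across all regression targets at once.

```python
import numpy as np

def l21_norm(W):
    # L_{2,1}-norm: sum over rows of each row's L2 norm.
    # Rows driven to (near) zero correspond to features dropped
    # jointly across all outputs -- the basis for feature selection.
    return float(np.sum(np.linalg.norm(W, axis=1)))

# Toy projection matrix with one all-zero row (a "deselected" feature).
W = np.array([[3.0, 4.0],
              [0.0, 0.0],
              [1.0, 0.0]])
print(l21_norm(W))  # 5 + 0 + 1 = 6.0
```

Unlike the squared L_{2}-norm (Frobenius) penalty of ordinary ridge regression, this row-wise penalty grows only linearly in each row's magnitude, which also tempers the influence of outlying samples when the same norm is used in the loss term.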