On the Benefits of Two Dimensional Metric Learning

Abstract:

In this paper, we study two dimensional metric learning (2DML) for matrix data from both theoretical and algorithmic perspectives. We first investigate the generalization bounds of 2DML based on the notion of Rademacher complexity, which theoretically justifies the benefits of learning from matrices directly. Furthermore, we present a novel boosting-based algorithm that scales well with the feature dimension. Finally, we introduce an efficient rank-one correction algorithm, tailored to our boosting procedure, that produces a low-rank solution to 2DML. Because our algorithm works directly on data in matrix representation, it scales well with the feature dimension, preserves the structure and dependencies in the data, and has a more compact form with far fewer parameters to optimize. Extensive evaluations on several benchmark data sets empirically verify the effectiveness and efficiency of our algorithm.
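The paper's exact construction is not reproduced on this page. As a hedged illustration of what a two-dimensional metric over matrix data looks like, the sketch below uses the common bilinear parameterization d^2(X, Y) = tr(L (X - Y) R (X - Y)^T) with positive semidefinite factors L and R, together with a hypothetical rank-one additive update in the spirit of a boosting procedure; the function names, the update rule, the step size, and the toy data are assumptions, not the authors' algorithm.

# Hedged sketch of a bilinear (two-dimensional) metric on m x n matrices.
# d^2(X, Y) = tr(L (X - Y) R (X - Y)^T), with PSD factors L (m x m) and R (n x n).
# The rank-one update below is only an illustration of a boosting-style step.
import numpy as np

def pairwise_dist2(X, Y, L, R):
    """Squared bilinear distance between two m x n matrices."""
    D = X - Y
    return np.trace(L @ D @ R @ D.T)

def rank_one_update(L, u, alpha):
    """Add a rank-one PSD component alpha * u u^T to a metric factor."""
    return L + alpha * np.outer(u, u)

# Toy usage on random 4 x 3 matrix samples (assumed data, for illustration only).
rng = np.random.default_rng(0)
m, n = 4, 3
X, Y = rng.standard_normal((m, n)), rng.standard_normal((m, n))
L, R = np.eye(m), np.eye(n)          # start from the Frobenius metric
print(pairwise_dist2(X, Y, L, R))    # equals ||X - Y||_F^2 at initialization

# One hypothetical boosting-style step: grow L by a rank-one term (m direction
# parameters plus one weight), keeping the parameter count far below the
# (mn)^2 entries of a full metric learned on vectorized data.
u = rng.standard_normal(m)
L = rank_one_update(L, u / np.linalg.norm(u), alpha=0.5)
print(pairwise_dist2(X, Y, L, R))

The point of the sketch is the parameterization: because the metric factors act on the rows and columns of the matrix separately, the number of parameters grows with m^2 + n^2 rather than (mn)^2, which is the scaling benefit the abstract refers to.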
Published in: IEEE Transactions on Knowledge and Data Engineering ( Volume: 35, Issue: 2, 01 February 2023)
Page(s): 1909 - 1921
Date of Publication: 27 July 2021
