
Self-Paced Enhanced Low-Rank Tensor Kernelized Multi-View Subspace Clustering


Abstract:

This paper addresses the multi-view subspace clustering problem and proposes the self-paced enhanced low-rank tensor kernelized multi-view subspace clustering (SETKMC) method, which is based on two motivations: (1) singular values of the representations and individual instances should be treated differently, because larger singular values usually capture the major information and should be penalized less, while samples with different degrees of noise have different reliability for clustering; (2) many existing methods degrade when multi-view features reside in different nonlinear subspaces, because they typically assume that the features lie in a union of several linear subspaces. SETKMC integrates a nonconvex tensor norm, self-paced learning, and the kernel trick into a unified model for multi-view subspace clustering. The nonconvex tensor norm imposes different weights on different singular values, self-paced learning gradually involves instances from more reliable to less reliable ones, and the kernel trick handles multi-view data lying in nonlinear subspaces. An iterative algorithm based on the alternating direction method of multipliers is proposed. Extensive results on seven real-world datasets show the effectiveness of the proposed SETKMC compared with fifteen state-of-the-art multi-view clustering methods.
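
Two of the ingredients named in the abstract can be illustrated with a short, self-contained sketch. The Python snippet below is not the authors' SETKMC implementation; it only sketches, with assumed function names, a toy weight schedule, and a simple reliability measure, (a) weighted singular value thresholding, the generic proximal step behind penalties that shrink larger singular values less, and (b) a hard self-paced weighting rule that admits samples from more reliable to less reliable.

```python
# Illustrative sketch only -- NOT the authors' SETKMC code. Function names,
# the weight schedule, and the per-sample reliability measure are assumptions.
import numpy as np

def weighted_svt(Y, weights):
    """Proximal step: argmin_X 0.5*||X - Y||_F^2 + sum_i w_i * sigma_i(X).

    With weights in non-descending order (so the largest singular value gets
    the smallest weight), the solution is weighted soft-thresholding of Y's
    singular values, as in weighted nuclear norm minimization.
    """
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_thr = np.maximum(s - weights, 0.0)      # shrink sigma_i by w_i
    return U @ np.diag(s_thr) @ Vt

def self_paced_weights(losses, age):
    """Hard self-paced regularizer: include a sample only if its loss is below
    the current 'age' threshold; raising the threshold over iterations admits
    harder (noisier) samples gradually."""
    return (losses <= age).astype(float)

# Toy usage: recover a low-rank matrix from noisy data, then flag reliable rows.
rng = np.random.default_rng(0)
L = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 40))   # rank-5 signal
Y = L + 0.1 * rng.standard_normal(L.shape)                        # noisy observation
w = 0.5 * np.arange(1, min(Y.shape) + 1)                          # non-descending weights
X = weighted_svt(Y, w)
sample_losses = np.linalg.norm(Y - X, axis=1) ** 2
v = self_paced_weights(sample_losses, age=np.median(sample_losses))
```

In the paper's full model these pieces would sit inside an ADMM loop over kernelized per-view representations; the sketch only isolates the singular-value weighting and the easy-to-hard sample selection that the abstract describes.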
Published in: IEEE Transactions on Multimedia (Volume: 24)
Page(s): 4054 - 4066
Date of Publication: 22 September 2021

