
Deep Clustering via Weighted k-Subspace Network


Abstract:

Subspace clustering aims to separate the data into clusters under the hypothesis that samples within the same cluster lie in the same low-dimensional subspace. Due to its hard pairwise constraints, k-subspace clustering is sensitive to outliers and initialization. In this letter, we present a novel deep architecture for k-subspace clustering to address this issue, called Deep Weighted k-Subspace Clustering (DWSC). Specifically, our framework consists of an autoencoder and a weighted k-subspace network. We first use the autoencoder to non-linearly compress the samples into a low-dimensional latent space. In the weighted k-subspace network, we feed the latent representations into an assignment network that outputs soft assignments, which represent the probability of each data point belonging to the corresponding subspace. Subsequently, the optimal k subspaces are identified by minimizing the projection residuals of the latent representations to all subspaces, using the learned soft assignments as a weighting vector. Finally, we jointly optimize the representation learning and clustering in a unified framework. Experimental results show that our approach outperforms state-of-the-art subspace clustering methods on two benchmark datasets.
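The weighted projection-residual objective described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the function name and argument shapes are assumptions, `Z` stands for the autoencoder's latent representations, `bases` for the k subspace bases, and `W` for the soft assignments produced by the assignment network.

```python
import numpy as np

def weighted_subspace_loss(Z, bases, W):
    """Weighted sum of squared projection residuals to k subspaces.

    Z     : (n, d) latent representations from the autoencoder
    bases : list of k (d, p) matrices with orthonormal columns (subspace bases)
    W     : (n, k) soft assignments, each row summing to 1
    """
    loss = 0.0
    for j, U in enumerate(bases):
        proj = Z @ U @ U.T                          # project each point onto subspace j
        residual = np.sum((Z - proj) ** 2, axis=1)  # squared residual per point
        loss += np.sum(W[:, j] * residual)          # weight residuals by soft assignments
    return loss
```

In the full DWSC framework this loss would be minimized jointly with the autoencoder's reconstruction loss, so that the latent space and the k subspaces are learned together rather than in separate stages.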
Published in: IEEE Signal Processing Letters ( Volume: 26, Issue: 11, November 2019)
Page(s): 1628 - 1632
Date of Publication: 13 September 2019

