Abstract:
Using random matrices to reduce the dimensionality of data has become an attractive approach in signal processing and machine learning since the rise of compressed sensing. One important example is compressed subspace clustering (CSC), a powerful unsupervised learning algorithm that performs subspace clustering after random projection. In our previous work, motivated by the central role of affinity in CSC and by the conjecture that the similarity (distance) between any two given subspaces remains almost unchanged after random projection, we first proved the restricted isometry property (RIP) of Gaussian random matrices for compressing subspaces, providing a strong theoretical guarantee for the performance of CSC. However, the probability bound estimated in that work does not match the forms of the RIP established in related fields such as compressed sensing, because the analysis techniques used there were too coarse. To address this issue, in this paper we rigorously derive a nearly optimal probability bound, which provides a more solid theoretical foundation for CSC and other subspace-related problems.
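The core idea the abstract describes, namely projecting data with a Gaussian random matrix and checking how well the affinity between two subspaces survives the projection, can be sketched as follows. This is a minimal illustration only: the function names, dimensions, and the use of the Frobenius-norm affinity (root-sum-square of the cosines of the principal angles) are assumptions for the sketch, not the paper's code.

```python
import numpy as np

def orthonormal_basis(rng, n, d):
    # A random d-dimensional subspace of R^n, represented by an
    # orthonormal basis obtained from a QR factorization.
    Q, _ = np.linalg.qr(rng.standard_normal((n, d)))
    return Q

def affinity(U, V):
    # Subspace affinity: the Frobenius norm of U^T V for orthonormal
    # bases U, V, i.e., sqrt(sum of squared cosines of principal angles).
    return np.linalg.norm(U.T @ V, ord="fro")

rng = np.random.default_rng(0)
n, m, d = 1000, 200, 3          # ambient dim, projected dim, subspace dim

U1 = orthonormal_basis(rng, n, d)
U2 = orthonormal_basis(rng, n, d)

# Gaussian random projection, scaled so squared norms are preserved
# in expectation (Johnson-Lindenstrauss-style scaling).
A = rng.standard_normal((m, n)) / np.sqrt(m)

# Re-orthonormalize the projected bases before measuring affinity.
V1, _ = np.linalg.qr(A @ U1)
V2, _ = np.linalg.qr(A @ U2)

print("affinity before projection:", affinity(U1, U2))
print("affinity after projection: ", affinity(V1, V2))
```

For d-dimensional subspaces the affinity always lies in [0, sqrt(d)], so the two printed values can be compared directly; the RIP-type result discussed in the abstract concerns how tightly, and with what probability, such quantities concentrate after projection.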
Published in: 2018 IEEE Data Science Workshop (DSW)
Date of Conference: 04-06 June 2018
Date Added to IEEE Xplore: 19 August 2018