Abstract:
Most matrix reconstruction methods assume that missing entries are randomly distributed in the incomplete matrix, and the low-rank prior or its variants are used to make the problem well-posed. In practical applications, however, missing entries are structurally rather than randomly distributed, and cannot be handled by the rank-minimization prior alone. To remedy this, this paper introduces new matrix reconstruction models using double priors on the latent matrix, named Reweighted Low-rank and Sparsity Priors (ReLaSP). In the proposed ReLaSP models, the matrix is regularized by a low-rank prior to exploit inter-column and inter-row correlations, and its columns (rows) are regularized by a sparsity prior under a dictionary to exploit intra-column (-row) correlations. Both the low-rank and sparsity priors are reweighted on the fly to promote low-rankness and sparsity, respectively. Numerical algorithms to solve the ReLaSP models are derived via the alternating direction method under the augmented Lagrangian multiplier framework. Results on synthetic data, image restoration tasks, and seismic data interpolation show that the proposed ReLaSP models are effective in recovering matrices degraded by highly structured missing entries and various types of noise, complementing classic matrix reconstruction models that handle only random missing entries.
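As a point of reference, a double-prior model of the kind described above can be sketched as the following optimization; the symbols here (singular-value weights $w_i$, diagonal sparsity weights $V$, dictionary $D$, trade-off parameter $\lambda$, and sampling operator $\mathcal{P}_\Omega$) are illustrative assumptions rather than the paper's exact formulation:

$$
\min_{X}\;\; \sum_{i} w_i\,\sigma_i(X)\;+\;\lambda \sum_{j} \left\| V\,D^{\top} x_j \right\|_{1}
\quad \text{s.t.} \quad \mathcal{P}_\Omega(X) = \mathcal{P}_\Omega(M),
$$

where $\sigma_i(X)$ are the singular values of the latent matrix $X$, $x_j$ denotes its $j$-th column, $M$ is the observed incomplete matrix, and the weights $w_i$ and $V$ are updated iteratively (reweighted on the fly) to promote low-rankness and sparsity. Under the augmented Lagrangian multiplier framework, such a problem would typically be split so that each alternating-direction subproblem reduces to a weighted singular-value thresholding step or a weighted soft-thresholding step.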
Published in: IEEE Transactions on Image Processing (Volume: 26, Issue: 3, March 2017)