Elsevier

Signal Processing

Volume 108, March 2015, Pages 259-271

Model based Bayesian compressive sensing via Local Beta Process

https://doi.org/10.1016/j.sigpro.2014.09.018

Highlights

  • This paper addresses the recovery problem in model-based compressive sensing.

  • A hierarchical Bayesian model is proposed to describe model-based compressive sensing.

  • A local beta process is applied to describe the inherent structures of sparse signals.

  • A variational Bayesian approach is exploited to implement the Bayesian inference.

Abstract

In the framework of Compressive Sensing (CS), the inherent structures underlying sparsity patterns can be exploited to improve reconstruction accuracy and robustness. This consideration leads to an extension of CS called model-based CS. In this paper, we propose a general statistical framework for model-based CS in which both sparsity and structure priors are considered simultaneously. By exploiting Latent Variable Analysis (LVA), a sparse signal is split into weight variables representing the values of elements and latent variables indicating the labels of elements. A Gamma-Gaussian model is then used to describe the weight variables and induce sparsity, while a beta process is assumed on each local cluster to describe the inherent structures. Since the complete model is an extension of Bayesian CS and the process captures local properties, the method is called Model-based Bayesian CS via Local Beta Process (MBCS-LBP). Moreover, the beta process is a Bayesian conjugate prior to the Bernoulli process, just as the Gamma is to the Gaussian distribution; this allows for analytical posterior inference through a variational Bayes algorithm and hence leads to a deterministic VB-EM iterative algorithm.

Introduction

Compressive Sensing (CS) provides a new sampling paradigm that allows signals to be sampled at sub-Nyquist rates, i.e. below twice the signal's Fourier bandwidth [1]. In the framework of CS, samples are collected via a linear projection satisfying the restricted isometry property, and provided that the signals are sparse (in themselves or under a transform), exact recovery can be guaranteed theoretically [2]. The reconstruction of signals is often cast as the following regularized optimization problem:

θ̂ = argmin_θ (1/2)‖y − Φθ‖₂² + λ·ψ(θ)     (1)

with θ ∈ ℝⁿ being the sparse signal, Φ ∈ ℝ^(m×n) the sensing matrix and y ∈ ℝᵐ the collected samples. ψ: ℝⁿ → ℝ₊ is a regularization term that induces the sparsity of solutions, e.g. the ℓ_p norm with p ∈ [0,1]. λ > 0 is a constant that balances distortion and sparsity.
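
For concreteness, a minimal sketch of solving this regularized problem with the ℓ₁ penalty via iterative soft thresholding (the p = 1 case discussed below) might look as follows; the problem sizes, random sensing matrix, and λ value are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def ista(y, Phi, lam, n_iter=1000):
    """Iterative soft thresholding for (1/2)*||y - Phi@theta||_2^2 + lam*||theta||_1."""
    L = np.linalg.norm(Phi, 2) ** 2          # Lipschitz constant of the data-fit gradient
    theta = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ theta - y)     # gradient of the quadratic term
        z = theta - grad / L                 # gradient step
        theta = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return theta

# Illustrative demo: recover a 5-sparse signal from 40 random projections.
rng = np.random.default_rng(0)
n, m, k = 100, 40, 5
theta_true = np.zeros(n)
theta_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random Gaussian sensing matrix
y = Phi @ theta_true
theta_hat = ista(y, Phi, lam=0.01)
err = np.linalg.norm(theta_hat - theta_true) / np.linalg.norm(theta_true)
```

The soft-threshold step is exactly the proximal operator of λ‖·‖₁, which is why this scheme targets the p = 1 instance of the regularized problem.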

Exploiting sparse regularization terms, many algorithms have been proposed recently, such as Linear Programming (Basis Pursuit) methods [3] and the iterative soft-thresholding algorithm [4] for p = 1; greedy methods [5], [6] and the iterative hard-thresholding algorithm [7] for p = 0; and iteratively re-weighted least squares regression [8] and Bayesian methods [9], [10], which can be considered to solve the regularized optimization problem with p ∈ (0,1].
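
The p = 0 case can be sketched similarly; below is a hedged illustration of iterative hard thresholding, where after every gradient step only the k largest-magnitude coefficients are kept (the dimensions and sparsity level k are assumed for the demo):

```python
import numpy as np

def iht(y, Phi, k, n_iter=300):
    """Iterative hard thresholding for the p = 0 case: project each
    gradient-step iterate onto the set of k-sparse vectors."""
    mu = 1.0 / np.linalg.norm(Phi, 2) ** 2   # normalized step size
    theta = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        z = theta + mu * Phi.T @ (y - Phi @ theta)
        idx = np.argsort(np.abs(z))[:-k]     # indices of all but the k largest entries
        z[idx] = 0.0                          # hard threshold: enforce k-sparsity
        theta = z
    return theta

rng = np.random.default_rng(0)
n, m, k = 100, 40, 5
theta_true = np.zeros(n)
theta_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
y = Phi @ theta_true
theta_hat = iht(y, Phi, k)
err = np.linalg.norm(theta_hat - theta_true) / np.linalg.norm(theta_true)
```

Unlike soft thresholding, this is a parametric approach: the sparsity level k must be supplied in advance, which foreshadows the parametric/non-parametric distinction drawn later for structured methods.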

On the other hand, besides the sparsity prior, structural constraints on the support of sparse signals have been found in many practical applications. The most common example is wavelet coefficients, which can be considered as a tree-structured sparse vector [11], [12] due to the relationship between adjacent wavelet scales. Another application is ECG telemonitoring, where Zhang et al. [13] exploited the block structure of the ECG signal to improve compression performance. Similar block/cluster structure has also been exploited in ISAR imaging, where the support constraints are due to the continuity of the target scene [14], [15], [16], and in image denoising, where similar patches share the same sparsity patterns [17].

Theoretically, it can be proven that the introduction of structural priors largely improves reconstruction performance [18], [19]. Considering structured sparse signal models leads to an extension of CS, i.e. Model-based Compressive Sensing (MCS) [19]. Accordingly, many new algorithms have been proposed by imposing structured sparse regularization on (1), for instance the mixed ℓ₁,₂ norm, which has been widely employed to cope with block-structured sparsity [20] or the multiple measurement vector model [17]. Other types of approaches have been exploited to cope with different structures. Group-structured extensions of OMP [21], Block-OMP [22], Tree-based OMP [19], etc. are recently proposed OMP-based approaches. However, the above methods are parametric approaches that require structure-related parameters to be set in advance. Another class of approaches is based on the Bayesian CS framework [9], including Tree-based Bayesian CS [11], CluSS-MCMC/-VB [23], [24], BSBL [25] and so on. The Bayesian approaches are non-parametric, but each is specific to one type of structure. More recently, a general model for structured sparse signals was proposed via the Boltzmann machine [26], [27], although the interaction matrix for the structural model must be set or pre-estimated.

In this paper, we propose a general statistical framework for model-based CS in which both sparsity and structure priors are considered simultaneously. By exploiting Latent Variable Analysis (LVA), a sparse signal is split into weight variables representing the values of elements and latent variables indicating the labels of elements. We then assume that the weight variables obey a Gamma-Gaussian model to induce sparsity. On the other hand, according to the inherent structures, the latent variables can be described by a graph with local clusters, so a beta-Bernoulli process is assumed on each local cluster to describe the properties of the structures. Since the complete model is an extension of Bayesian CS and the process captures local properties, the method is called Model-based Bayesian CS via Local Beta Process (MBCS-LBP). Moreover, the beta process is a Bayesian conjugate prior to the Bernoulli process, just as the Gamma is to the Gaussian distribution; this allows for analytical posterior inference through a variational Bayes algorithm and hence leads to a deterministic VB-EM iterative algorithm.
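
A minimal generative sketch of the LVA decomposition can clarify the split into weights and labels. The code below is only an illustration under stated assumptions: the hyperparameters, the window size, and the local averaging of activation probabilities are a simplified stand-in for the paper's local beta process (whose exact neighborhood construction is given in Section 3):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64
a0, b0 = 1.0, 1.0        # Gamma hyperparameters (illustrative assumptions)

# Weight variables: precision alpha_i ~ Gamma(a0, b0), weight w_i ~ N(0, 1/alpha_i).
alpha = rng.gamma(a0, 1.0 / b0, size=n)
w = rng.normal(0.0, 1.0 / np.sqrt(alpha))

# Latent labels: each element draws an activation probability from a beta prior,
# then probabilities are averaged over a small window so that neighbouring
# elements tend to switch on together (simplified local-cluster behaviour).
pi = rng.beta(1.0, 9.0, size=n)               # sparse prior: E[pi_i] = 0.1
window = np.ones(3) / 3.0
pi_local = np.convolve(pi, window, mode="same")
z = rng.binomial(1, pi_local)                 # Bernoulli labels (support indicators)

theta = z * w                                 # sparse signal = label * weight
```

The key point the sketch captures is the factorization θᵢ = zᵢ·wᵢ: sparsity is carried by the binary labels z, values by the weights w, and structure enters only through the coupling of nearby activation probabilities.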

The rest of the paper is organized as follows. Section 2 briefly reviews the framework of Bayesian CS. The proposed structured sparsity model is presented in Section 3. The posterior inference through the variational Bayes approach is given in Section 4. Experiments are carried out in Section 5 to verify the superior performance of the proposed algorithm. The paper ends with a conclusion.

Section snippets

Bayesian compressive sensing

The canonical form of CS can be written as follows:

y = Φθ + ϵ

where Φ ∈ ℝ^(m×n) is the sensing matrix satisfying the so-called RIP [2], θ ∈ ℝⁿ is the original sparse signal, y ∈ ℝᵐ is the compressed measurement and ϵ is possible noise or perturbation. Note that m ≪ n to ensure sufficient compression, so the reconstruction of θ degenerates into an underdetermined linear inverse problem. Assuming white noise ϵ with variance σ₀ = α₀⁻¹, one can easily obtain a Gaussian likelihood model on the
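
The measurement model and the resulting Gaussian likelihood can be written out numerically as below; the dimensions, sparsity level, and noise precision α₀ = 100 are illustrative assumptions for the sketch, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 30, 80                                  # m << n: the system is underdetermined
Phi = rng.standard_normal((m, n)) / np.sqrt(m) # random sensing matrix
theta = np.zeros(n)
theta[rng.choice(n, 4, replace=False)] = 1.0   # 4-sparse signal
alpha0 = 100.0                                 # noise precision; variance = 1/alpha0
eps = rng.normal(0.0, 1.0 / np.sqrt(alpha0), size=m)
y = Phi @ theta + eps                          # compressed measurements

# Gaussian log-likelihood: log p(y | theta, alpha0) = log N(y; Phi@theta, (1/alpha0) I)
resid = y - Phi @ theta
log_lik = 0.5 * m * np.log(alpha0 / (2.0 * np.pi)) - 0.5 * alpha0 * (resid @ resid)
```

This white-noise likelihood is the starting point of the Bayesian CS formulation: priors on θ (sparsity, and later structure) are combined with it to form the posterior.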

Model based Bayesian CS via Local Beta Process

In a probabilistic Bayesian approach, latent variables are often exploited, through Graphical Models (GMs) [31], to describe the dependencies (or joint probability distributions) between observations and parameters. This kind of method is usually called latent variable analysis [32] and is widely used in document categorization. By imposing relations on the underlying sparsity pattern, structural dependencies between the sparse coefficients can be conveniently considered in a Bayesian model by latent

Variational Bayesian inference

Generally, in the Bayesian framework, all variables in the probabilistic model are assumed to be drawn from some distributions. They are often partitioned into the observed data, denoted y, the hidden variables, denoted x, and the parameters, denoted Θ. Then the probabilistic model can be given by p(x,y,Θ|M) with M being the
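
Since the paper's full MBCS-LBP inference is derived in Section 4, the following is only a hedged sketch of the conjugate VB-EM idea for the Gamma-Gaussian part of the model (without the beta-Bernoulli structure); the hyperparameters a, b and the fixed noise precision alpha0 are illustrative assumptions:

```python
import numpy as np

def vb_gamma_gaussian(y, Phi, alpha0=100.0, a=1e-6, b=1e-6, n_iter=50):
    """Mean-field VB for a sparse linear model with Gamma-Gaussian priors:
    theta_i ~ N(0, 1/alpha_i), alpha_i ~ Gamma(a, b), fixed noise precision alpha0.
    Conjugacy makes both factor updates closed form (a VB-EM style iteration)."""
    m, n = Phi.shape
    E_alpha = np.ones(n)                       # E[alpha_i] under the current q(alpha)
    PtP, Pty = Phi.T @ Phi, Phi.T @ y
    mu = np.zeros(n)
    for _ in range(n_iter):
        # Update q(theta) = N(mu, Sigma): Gaussian posterior given E[alpha].
        Sigma = np.linalg.inv(alpha0 * PtP + np.diag(E_alpha))
        mu = alpha0 * Sigma @ Pty
        # Update q(alpha_i) = Gamma(a + 1/2, b + E[theta_i^2]/2) via conjugacy.
        E_theta2 = mu ** 2 + np.diag(Sigma)
        E_alpha = (a + 0.5) / (b + 0.5 * E_theta2)
    return mu

# Illustrative demo: sparse recovery from noisy compressed measurements.
rng = np.random.default_rng(3)
n, m, k = 100, 40, 5
theta_true = np.zeros(n)
theta_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
y = Phi @ theta_true + rng.normal(0.0, 0.1, size=m)
mu = vb_gamma_gaussian(y, Phi)
err = np.linalg.norm(mu - theta_true) / np.linalg.norm(theta_true)
```

Each update maximizes the variational lower bound with the other factor held fixed, so the iteration is deterministic, which is the property the paper exploits to obtain its VB-EM algorithm.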

Numerical experiments

The purpose of these experiments is to demonstrate the advantage of the proposed framework. In particular, comparisons are made to algorithms that do not consider structure, such as Basis Pursuit (BP), CoSaMP and Bayesian CS (BCS), as well as to algorithms that do consider structure, such as cluster-based BCS (CluSS-VB, proposed in [24]), BSBL [25]

Conclusion

In this paper, a general Bayesian framework for exploiting structures is proposed, namely MBCS-LBP. In particular, structures in sparse signals can be considered as local clusters on graphs, i.e. the implicit properties of the structures are locality and clustering. Consequently, a beta process is employed to model the cluster property of the local structural neighborhood of each coefficient of the sparse vector. Thanks to the conjugacy of the beta process and the Bernoulli process, it results in

Acknowledgments

This work was funded by the National Natural Science Foundation of China under Grant no. 61401315 and the Fundamental Research Funds for the Central Universities under Grant no. 212-274016. The authors express many thanks to the anonymous reviewers for their valuable suggestions.

References (39)

  • S. Derin Babacan et al.

Bayesian compressive sensing using Laplace priors

    IEEE Trans. Image Process.

    (2010)
  • L. He et al.

    Exploiting structure in wavelet-based Bayesian compressive sensing

    IEEE Trans. Signal Process.

    (2009)
  • M.F. Duarte, M.B. Wakin, R.G. Baraniuk, Wavelet-domain compressive signal reconstruction using a hidden Markov tree...
  • Zhilin Zhang et al.

    Compressed sensing for energy-efficient wireless telemonitoring of noninvasive fetal ECG via block sparse Bayesian learning

    IEEE Trans. Bio-med. Eng.

    (2013)
  • Lu Wang et al.

    Enhanced ISAR imaging by exploiting the continuity of the target scene

    IEEE Trans. Geosci. Remote Sens.

    (2014)
  • Lifan Zhao et al.

    An improved auto-calibration algorithm based on sparse Bayesian learning framework

    IEEE Signal Process. Lett.

    (2013)
  • Lifan Zhao et al.

    An autofocus technique for high-resolution inverse synthetic aperture radar imagery

    IEEE Trans. Geosci. Remote Sens.

    (2014)
  • Weisheng Dong et al.

Nonlocal image restoration with bilateral variance estimation: a low-rank approach

    IEEE Trans. Image Process.

    (2013)
  • Thomas Blumensath et al.

    Sampling theorems for signals from the union of finite-dimensional linear subspaces

    IEEE Trans. Inf. Theory

    (2009)