
Multitask classification and reconstruction using extended Turbo approximate message passing

  • Original Paper
  • Published:
Signal, Image and Video Processing

Abstract

Approximate message passing (AMP) is a well-known compressive sensing (CS) algorithm, valued for its computational efficiency, strong reconstruction performance, and deterministic state evolution (SE) trajectory. Turbo generalized AMP (Turbo-GAMP) builds on AMP and has been extended to multitask CS with multiple measurement vectors (MMVs). The resulting Turbo-GAMP-MMV can jointly reconstruct multiple structured-sparse signals when they are well correlated. This paper considers the case where the CS tasks belong to different groups and signals from different groups may be only weakly correlated. We exploit the SE property to enhance Turbo-GAMP-MMV for weakly correlated signals. The developed methods first perform task classification by dividing the CS tasks into groups and then jointly reconstruct the original signals within each group. Experiments on synthetic signals and grey-scale images show that the new algorithms outperform several state-of-the-art benchmark techniques.
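For readers unfamiliar with the baseline algorithm the abstract builds on, the following is a minimal sketch of a plain AMP iteration with a soft-thresholding denoiser — the single-task, single-group special case, not the Turbo-GAMP-MMV machinery developed in the paper. The matrix sizes, the SE-style noise estimate, and the threshold constant `alpha` are illustrative assumptions for this demo, not settings from the paper.

```python
import numpy as np

def soft_threshold(v, theta):
    """Soft-thresholding denoiser, the standard scalar denoiser in AMP for sparse signals."""
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def amp_recover(A, y, n_iter=30, alpha=2.0):
    """Plain AMP for y = A x with sparse x (no Turbo/MMV extensions).

    alpha is a threshold-tuning constant chosen for this sketch, not a value from the paper.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(n_iter):
        sigma = np.linalg.norm(z) / np.sqrt(m)        # SE-style estimate of the effective noise level
        x_new = soft_threshold(x + A.T @ z, alpha * sigma)
        # Onsager correction: (n/m) times the mean derivative of the denoiser
        z = y - A @ x_new + (n / m) * np.mean(np.abs(x_new) > 0) * z
        x = x_new
    return x

# Demo: recover a 5%-sparse signal from 50% noiseless measurements.
rng = np.random.default_rng(0)
m, n, k = 250, 500, 25
A = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))    # i.i.d. N(0, 1/m) entries, near-unit-norm columns
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)
x_hat = amp_recover(A, A @ x0)
rel_err = np.linalg.norm(x_hat - x0) / np.linalg.norm(x0)
```

The Onsager term is what distinguishes AMP from plain iterative soft thresholding: it decouples the errors across iterations so that the effective noise stays Gaussian-like, which is exactly the property that makes the deterministic SE trajectory (exploited in the paper for task classification) hold.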


(Figures 1–6 appear in the full-text article.)



Author information

Correspondence to Le Yang.

Additional information

This work was supported by the Natural Science Foundation of China (Nos. 61304264 and 61305017).

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (pdf 380 KB)


About this article


Cite this article

Wang, YG., Yang, L., Tang, ZY. et al. Multitask classification and reconstruction using extended Turbo approximate message passing. SIViP 11, 219–226 (2017). https://doi.org/10.1007/s11760-016-0922-5

