
\(\ell _{1/2,1}\) group sparse regularization for compressive sensing

  • Original Paper
Signal, Image and Video Processing

Abstract

The design of group sparsity-inducing regularization has recently drawn much attention in the group sparse signal recovery problem. Two of the most popular group sparsity-inducing regularization models are \(\ell _{1,2}\) and \(\ell _{1,\infty }\) regularization. Nevertheless, neither promotes intra-group sparsity. For example, Huang and Zhang (Ann Stat 38:1978–2004, 2010) claimed that \(\ell _{1,2}\) regularization is superior to \(\ell _1\) regularization only for strongly group sparse signals, which means that \(\ell _{1,2}\) regularization cannot exploit sparsity within groups. Our experiments show that, under \(\ell _{1,\infty }\) regularization, recovering signals that are sparse within groups requires more measurements than recovering signals that are not. In this paper, we propose a novel group sparsity-inducing regularization defined as a mixture of the \(\ell _{1/2}\) norm and the \(\ell _{1}\) norm, referred to as \(\ell _{1/2,1}\) regularization, which overcomes these shortcomings of \(\ell _{1,2}\) and \(\ell _{1,\infty }\) regularization. We define a new null space property for \(\ell _{1/2,1}\) regularization and apply it to establish a recoverability theory for signals that are sparse both within and across groups. In addition, we introduce an iteratively reweighted algorithm to solve this model and analyze its convergence. Comprehensive experiments on simulated data show that the proposed \(\ell _{1/2,1}\) regularization is superior to \(\ell _{1,2}\) and \(\ell _{1,\infty }\) regularization.
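The penalties compared in the abstract can be made concrete with a short sketch. The exact form of the \(\ell _{1/2,1}\) penalty is given in the full paper; a plausible reading of "mixture of the \(\ell _{1/2}\) norm and the \(\ell _{1}\) norm" is the \(\ell _1\) norm inside each group, combined across groups by an \(\ell _{1/2}\)-type sum. The function names and this specific form are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def group_l_half_l1(x, groups):
    # hypothetical l_{1/2,1} penalty: l_1 norm inside each group,
    # combined across groups by an l_{1/2}-type sum
    return sum(np.sum(np.abs(x[g])) ** 0.5 for g in groups)

def group_l1_l2(x, groups):
    # l_{1,2} penalty (group lasso): sum of group l_2 norms
    return sum(np.linalg.norm(x[g]) for g in groups)

def group_l1_linf(x, groups):
    # l_{1,inf} penalty: sum of group max-magnitudes
    return sum(np.max(np.abs(x[g])) for g in groups)

# three nonoverlapping groups of size two; the first group is dense,
# the last is sparse within the group
x = np.array([3.0, -4.0, 0.0, 0.0, 1.0, 0.0])
groups = [np.arange(0, 2), np.arange(2, 4), np.arange(4, 6)]
```

The key difference is that the inner \(\ell _1\) norm, unlike the inner \(\ell _2\) or \(\ell _{\infty }\) norms, is itself sparsity-inducing within each group, which is the intra-group effect the abstract emphasizes.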

Notes

  1. The group structure on the signal \(\varvec{x}\) can be overlapping or nonoverlapping; for simplicity, we consider only the nonoverlapping case.

  2. FISTA is not guaranteed to be a descent algorithm; one can instead apply the variant of FISTA proposed in [6] to further enhance performance.
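As a rough illustration of how the iteratively reweighted scheme from the abstract and the FISTA subsolver from this note could fit together, the sketch below majorizes each concave \(\sqrt{\Vert \varvec{x}_g\Vert _1}\) term by a weighted \(\ell _1\) term and solves the resulting surrogate with plain FISTA. All names, the weight formula, the \(\epsilon \)-smoothing, and the parameter values are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def soft(z, t):
    # elementwise soft-thresholding, the prox of the l_1 norm
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def irw_group_l_half_l1(A, y, groups, lam=0.1, eps=1e-3,
                        outer=10, inner=100):
    """Hypothetical iteratively reweighted sketch for
    min_x 0.5 * ||A x - y||^2 + lam * sum_g sqrt(||x_g||_1).
    Each outer step linearizes the concave sqrt at the current
    iterate, giving a weighted-l_1 surrogate solved by FISTA."""
    m, n = A.shape
    x = np.zeros(n)
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    for _ in range(outer):
        # group weights from the current iterate (eps avoids division by zero)
        w = np.empty(n)
        for g in groups:
            w[g] = 0.5 / np.sqrt(np.sum(np.abs(x[g])) + eps)
        # FISTA on the weighted-l_1 surrogate
        z, x_old, t = x.copy(), x.copy(), 1.0
        for _ in range(inner):
            grad = A.T @ (A @ z - y)
            x_new = soft(z - grad / L, lam * w / L)
            t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
            z = x_new + ((t - 1) / t_new) * (x_new - x_old)
            x_old, t = x_new, t_new
        x = x_old
    return x

# demo on a tiny synthetic problem: one active group, sparse within the group
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 60))
groups = [np.arange(i, i + 6) for i in range(0, 60, 6)]
x_true = np.zeros(60)
x_true[[0, 3]] = [1.5, -2.0]
y = A @ x_true
x_hat = irw_group_l_half_l1(A, y, groups, lam=0.1)
```

Because the surrogate upper-bounds the smoothed objective, each outer pass with a sufficiently accurate inner solve cannot increase it, which matches the majorize–minimize convergence argument sketched in the paper's analysis.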

References

  1. Donoho, D.: Compressed sensing. IEEE Trans. Inf. Theory 52, 1289–1306 (2006)

  2. Candès, E., Romberg, J., Tao, T.: Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information. IEEE Trans. Inf. Theory 52, 489–509 (2006)

  3. Candès, E., Tao, T.: Near-optimal signal recovery from random projections: universal encoding strategies? IEEE Trans. Inf. Theory 52, 5406–5425 (2006)

  4. Baraniuk, R.G.: Compressive sensing. IEEE Signal Process. Mag. 24, 118–121 (2007)

  5. Kose, K., Gunay, O., Cetin, A.E.: Compressive sensing using the modified entropy functional. Digit. Signal Process. 24, 63–70 (2014)

  6. Karahanoglu, N.B., Erdogan, H.: Compressed sensing signal recovery via forward–backward pursuit. Digit. Signal Process. 23, 1539–1548 (2013)

  7. Yuan, M., Lin, Y.: Model selection and estimation in regression with grouped variables. J. R. Stat. Soc. B 68, 49–67 (2006)

  8. Meier, L., van de Geer, S., Bühlmann, P.: The group lasso for logistic regression. J. R. Stat. Soc. B 70, 53–71 (2008)

  9. Baraniuk, R.G., Cevher, V., Duarte, M.F., Hegde, C.: Model-based compressive sensing. IEEE Trans. Inf. Theory 56, 1982–2001 (2010)

  10. Friedman, J., Hastie, T., Tibshirani, R.: A note on the group lasso and a sparse group lasso. arXiv:1001.0736 [math.ST] (2010)

  11. Jenatton, R., Mairal, J., Obozinski, G., Bach, F.: Proximal methods for hierarchical sparse coding. J. Mach. Learn. Res. 12, 2297–2334 (2011)

  12. Jenatton, R., Gramfort, A., Michel, V., et al.: Multiscale mining of fMRI data with hierarchical structured sparsity. SIAM J. Imaging Sci. 5, 835–856 (2012)

  13. Sprechmann, P., Ramirez, I., Sapiro, G., Eldar, Y.C.: C-HiLasso: a collaborative hierarchical sparse modeling framework. IEEE Trans. Signal Process. 59, 4183–4198 (2011)

  14. Jenatton, R., Audibert, J.Y., Bach, F.: Structured variable selection with sparsity-inducing norms. J. Mach. Learn. Res. 12, 2777–2824 (2011)

  15. Xiang, S., Tong, X., Ye, J.: Efficient sparse group feature selection via nonconvex optimization. In: Proceedings of the 30th International Conference on Machine Learning (ICML-13), pp. 1–13 (2013)

  16. Majumdar, A., Ward, R.K.: Non-convex group sparsity: application to color imaging. In: Proc. IEEE ICASSP, pp. 469–472 (2010)

  17. Huang, J., Zhang, T., Metaxas, D.: Learning with structured sparsity. J. Mach. Learn. Res. 12, 3371–3412 (2011)

  18. Huang, J., Zhang, T.: The benefit of group sparsity. Ann. Stat. 38, 1978–2004 (2010)

  19. Liu, J., Ye, J.: Efficient \(\ell _1/\ell _q\) norm regularization. arXiv:1009.4766 (2010)

  20. Mairal, J., Jenatton, R., Obozinski, G., Bach, F.: Convex and network flow optimization for structured sparsity. J. Mach. Learn. Res. 12, 2681–2720 (2011)

  21. Simon, N., Friedman, J., Hastie, T., Tibshirani, R.: A sparse-group lasso. J. Comput. Graph. Stat. 22, 231–245 (2013)

  22. Xu, Z., Zhang, H., Wang, Y.: \(L_{1/2}\) regularization. Sci. China Ser. F 53, 1159–1169 (2010)

  23. Xu, Z., Chang, X., Xu, F.: \(L_{1/2}\) regularization: a thresholding representation theory and a fast solver. IEEE Trans. Neural Netw. Learn. Syst. 23, 1013–1027 (2012)

  24. Stojnic, M., Parvaresh, F., Hassibi, B.: On the reconstruction of block-sparse signals with an optimal number of measurements. IEEE Trans. Signal Process. 57, 3075–3085 (2009)

  25. Zou, H., Li, R.: One-step sparse estimates in nonconcave penalized likelihood models. Ann. Stat. 36, 1509–1533 (2008)

  26. Wright, S.J., Nowak, R.D., Figueiredo, M.A.T.: Sparse reconstruction by separable approximation. IEEE Trans. Signal Process. 57, 2479–2493 (2009)

  27. Beck, A., Teboulle, M.: A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. Imaging Sci. 2, 183–202 (2009)

  28. Hunter, D., Li, R.: Variable selection using MM algorithms. Ann. Stat. 33, 1617–1642 (2005)

  29. Deng, W., Yin, W., Zhang, Y.: Group sparse optimization by alternating direction method. Rice CAAM Report TR11-06 (2011)

Acknowledgments

The authors would like to thank the editors and two reviewers for their valuable comments and suggestions, which led to a substantial improvement of this paper.

Author information

Corresponding author

Correspondence to Junmin Liu.

Additional information

This work was supported in part by the National Basic Research Program of China (973) (Grant No. 2013CB329404), in part by the National Natural Science Foundation of China (Grant Nos. 11401465, 91230101 and 61572393), the Fundamental Research Funds for the Central Universities (Grant No. xjj2014010), and the Projects funded by China Postdoctoral Science Foundation (Grant No. 2014M560781) and Shaanxi Postdoctoral Science Foundation.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (pdf 312 KB)

About this article

Cite this article

Liu, S., Zhang, J., Liu, J. et al. \(\ell _{1/2,1}\) group sparse regularization for compressive sensing. SIViP 10, 861–868 (2016). https://doi.org/10.1007/s11760-015-0829-6
