Constrained Mix Sparse Optimization via Hard Thresholding Pursuit

Journal of Scientific Computing, 2024

Abstract

The mix sparse structure, namely sparsity appearing in the inter-group and intra-group manners simultaneously, is inherent in a wide class of practical applications. Hard thresholding pursuit (HTP) is a practical and efficient algorithm for solving the least squares problem with a cardinality constraint. In this paper, we propose an HTP-based algorithm, named MixHTP, for solving a constrained mix sparse optimization problem, and we establish its linear convergence under the restricted isometry property. Moreover, we apply MixHTP to compressive sensing with simulated data and to enhanced indexation with real data. Numerical results show that MixHTP performs excellently at approaching a solution with mix sparse structure and that it outperforms several state-of-the-art algorithms in the literature.
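To make the algorithmic idea concrete, an HTP-style iteration for a mix sparse model can be sketched as follows. This is an illustrative reading only, assuming the mix sparse constraint means "at most S active groups, with at most s active entries within each active group"; the thresholding operator, step size, and stopping details of the paper's actual MixHTP may differ.

```python
import numpy as np

def mix_hard_threshold(x, groups, S, s):
    """Keep the S groups of largest l2 norm and, within each kept group,
    the s entries of largest magnitude; zero out everything else."""
    z = np.zeros_like(x)
    energies = [np.linalg.norm(x[g]) for g in groups]
    for gi in np.argsort(energies)[-S:]:           # top-S groups by energy
        g = np.asarray(groups[gi])
        top = g[np.argsort(np.abs(x[g]))[-s:]]     # top-s entries in the group
        z[top] = x[top]
    return z

def mix_htp(A, y, groups, S, s, step=1.0, iters=50):
    """HTP-style loop: gradient step, mix-sparse thresholding to select a
    support, then a least-squares (pursuit) correction on that support."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        u = x + step * A.T @ (y - A @ x)                         # gradient step
        supp = np.flatnonzero(mix_hard_threshold(u, groups, S, s))
        x = np.zeros(A.shape[1])
        x[supp] = np.linalg.lstsq(A[:, supp], y, rcond=None)[0]  # pursuit step
    return x
```

On a well-conditioned noiseless instance, the pursuit step typically makes the iteration terminate at the exact solution within a few steps once the correct support has been identified, since the restricted least squares solve is then exact and the support becomes a fixed point.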


(Algorithm 1 and Figs. 1–6 appear in the full text of the article.)

Data availability

The test data are available from the authors upon request.

Notes

  1. MixIHT is a variant of MixHTP obtained by removing the pursuit step (i.e., line 9 in Algorithm 1); its idea is inspired by [8, 25], and its convergence theorem can be proved by a line of analysis similar to that of Theorem 1.

  2. GroupHTPC is implemented for the group sparse optimization problem (2) under constraint \(\mathcal {C}\) by removing the thresholding operator \(\mathcal {H}(x;s)\) from MixHTP.

  3. https://en.wikipedia.org/wiki/List_of_S%26P_500_companies

  4. HTPC is implemented for the sparse optimization problem (1) under constraint \(\mathcal {C}\) by removing the thresholding operator \(\mathcal {H}_G(x;S)\) from MixHTP.
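The variants described in these notes differ from MixHTP only in which step is removed. For instance, an MixIHT-style method (note 1) drops the pursuit step, leaving a plain gradient-plus-thresholding loop. The sketch below passes the thresholding operator in as a function argument, so the same loop covers entry-level, group-level, or mix-sparse thresholding; it is an illustration under these assumptions, not the paper's exact specification.

```python
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x (an H(x;k)-style operator)."""
    z = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    z[idx] = x[idx]
    return z

def mix_iht(A, y, threshold, step, iters=500):
    """IHT-style variant: gradient step followed by structured
    thresholding, with no least-squares (pursuit) correction step."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = threshold(x + step * A.T @ (y - A @ x))
    return x
```

Here `threshold` can be any structured thresholding operator. Without the pursuit correction, convergence is slower and more sensitive to the step size; a standard safe choice is a step satisfying step * ||A||^2 < 1, where ||A|| is the spectral norm.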

References

  1. Baraniuk, R.G., Cevher, V., Duarte, M.F., Hegde, C.: Model-based compressive sensing. IEEE Trans. Inf. Theory 56(4), 1982–2001 (2010)

  2. Beasley, J.E., Meade, N., Chang, T.J.: An evolutionary heuristic for the index tracking problem. Eur. J. Oper. Res. 148(3), 621–643 (2003)

  3. Belloni, A., Chernozhukov, V.: Least squares after model selection in high-dimensional sparse models. Bernoulli 19(2), 521–547 (2013)

  4. Benidis, K., Feng, Y., Palomar, D.P.: Sparse portfolios for high-dimensional financial index tracking. IEEE Trans. Signal Process. 66(1), 155–170 (2017)

  5. Bertsekas, D., Nedic, A., Ozdaglar, A.: Convex Analysis and Optimization. Athena Scientific (2003)

  6. Bian, W., Wu, F.: Accelerated smoothing hard thresholding algorithms for \(\ell _0\) regularized nonsmooth convex regression problem. J. Sci. Comput. 96(2), 33 (2023)

  7. Blanchard, J.D., Cermak, M., Hanle, D., Jing, Y.: Greedy algorithms for joint sparse recovery. IEEE Trans. Signal Process. 62(7), 1694–1704 (2014)

  8. Blumensath, T., Davies, M.E.: Iterative hard thresholding for compressed sensing. Appl. Comput. Harmon. Anal. 27(3), 265–274 (2009)

  9. Cai, T.T., Wang, L., Xu, G.: New bounds for restricted isometry constants. IEEE Trans. Inf. Theory 56(9), 4388–4394 (2010)

  10. Cai, T.T., Xu, G., Zhang, J.: On recovery of sparse signals via \(\ell _1\) minimization. IEEE Trans. Inf. Theory 55(7), 3388–3397 (2009)

  11. Canakgoz, N.A., Beasley, J.E.: Mixed-integer programming approaches for index tracking and enhanced indexation. Eur. J. Oper. Res. 196(1), 384–399 (2009)

  12. Candes, E.J., Romberg, J.K., Tao, T.: Stable signal recovery from incomplete and inaccurate measurements. Commun. Pure Appl. Math. 59(8), 1207–1223 (2006)

  13. Candes, E.J., Tao, T.: Decoding by linear programming. IEEE Trans. Inf. Theory 51(12), 4203–4215 (2005)

  14. Chen, J., Dai, G., Zhang, N.: An application of sparse-group lasso regularization to equity portfolio optimization and sector selection. Ann. Oper. Res. 284(1), 243–262 (2020)

  15. Eldar, Y.C., Kuppinger, P., Bolcskei, H.: Block-sparse signals: uncertainty relations and efficient recovery. IEEE Trans. Signal Process. 58(6), 3042–3054 (2010)

  16. Eldar, Y.C., Mishali, M.: Robust recovery of signals from a structured union of subspaces. IEEE Trans. Inf. Theory 55(11), 5302–5316 (2009)

  17. Foucart, S.: Hard thresholding pursuit: an algorithm for compressive sensing. SIAM J. Numer. Anal. 49(6), 2543–2563 (2011)

  18. Foucart, S.: Sparse recovery algorithms: sufficient conditions in terms of restricted isometry constants. In: Approximation Theory XIII: San Antonio 2010, pp. 65–77. Springer (2012)

  19. Foucart, S., Rauhut, H.: A Mathematical Introduction to Compressive Sensing. Applied and Numerical Harmonic Analysis. Springer, New York (2013)

  20. Fu, A., Narasimhan, B., Boyd, S.: CVXR: an R package for disciplined convex optimization. J. Stat. Softw. 94, 1–34 (2020)

  21. Hastie, T., Tibshirani, R., Wainwright, M.: Statistical Learning with Sparsity: The Lasso and Generalizations. CRC Press (2015)

  22. Hu, Y., Hu, X., Yang, X.: On convergence of iterative thresholding algorithms to approximate sparse solution for composite nonconvex optimization. Math. Program. 1–26 (2024)

  23. Hu, Y., Li, C., Meng, K., Qin, J., Yang, X.: Group sparse optimization via \(\ell _{p, q}\) regularization. J. Mach. Learn. Res. 18(1), 960–1011 (2017)

  24. Huang, J., Zhang, T.: The benefit of group sparsity. Ann. Stat. 38(4), 1978–2004 (2010)

  25. Jain, P., Rao, N., Dhillon, I.S.: Structured sparse regression via greedy hard thresholding. Adv. Neural Inf. Process. Syst. 29 (2016)

  26. Kan, R., Smith, D.R.: The distribution of the sample minimum-variance frontier. Manag. Sci. 54(7), 1364–1380 (2008)

  27. Kan, R., Zhou, G.: Optimal portfolio choice with parameter uncertainty. J. Financ. Quant. Anal. 42(3), 621–656 (2007)

  28. Majumdar, A.: Iterative re-weighted least squares algorithms for non-negative sparse and group-sparse recovery. In: ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4423–4427. IEEE (2022)

  29. Meinshausen, N., Yu, B.: Lasso-type recovery of sparse representations for high-dimensional data. Ann. Stat. 37(1), 246–270 (2009)

  30. Needell, D., Tropp, J.A.: CoSaMP: iterative signal recovery from incomplete and inaccurate samples. Appl. Comput. Harmon. Anal. 26(3), 301–321 (2009)

  31. Nguyen, T.T., Idier, J., Soussen, C., Djermoune, E.H.: Non-negative orthogonal greedy algorithms. IEEE Trans. Signal Process. 67(21), 5643–5658 (2019)

  32. O’Hanlon, K., Nagano, H., Keriven, N., Plumbley, M.D.: Non-negative group sparsity with subspace note modelling for polyphonic transcription. IEEE/ACM Trans. Audio Speech Lang. Process. 24(3), 530–542 (2016)

  33. Qi, R., Yang, D., Zhang, Y., Li, H.: On recovery of block sparse signals via block generalized orthogonal matching pursuit. Signal Process. 153, 34–46 (2018)

  34. Qin, J., Hu, Y., Xu, F., Yalamanchili, H.K., Wang, J.: Inferring gene regulatory networks by integrating ChIP-seq/chip and transcriptome data via lasso-type regularization methods. Methods 67(3), 294–303 (2014)

  35. Sharma, A., Mehra, A.: Financial analysis based sectoral portfolio optimization under second order stochastic dominance. Ann. Oper. Res. 256(1), 171–197 (2017)

  36. Shu, L., Shi, F., Tian, G.: High-dimensional index tracking based on the adaptive elastic net. Quant. Finance 20(9), 1513–1530 (2020)

  37. Simon, N., Friedman, J., Hastie, T., Tibshirani, R.: A sparse-group lasso. J. Comput. Graph. Stat. 22(2), 231–245 (2013)

  38. Tao, M., Zhang, X.P.: Study on L1 over L2 minimization for nonnegative signal recovery. J. Sci. Comput. 95(3), 94 (2023)

  39. Tibshirani, R.: Regression shrinkage and selection via the lasso. J. R. Stat. Soc. Ser. B (Methodol.) 58(1), 267–288 (1996)

  40. Wainwright, M.J.: High-Dimensional Statistics: A Non-asymptotic Viewpoint. Cambridge University Press (2019)

  41. Xu, F., Ma, J., Lu, H.: Group sparse enhanced indexation model with adaptive beta value. Quant. Finance 22(10), 1905–1926 (2022)

  42. Xu, Z., Chang, X., Xu, F., Zhang, H.: \(l_{1/2}\) regularization: a thresholding representation theory and a fast solver. IEEE Trans. Neural Netw. Learn. Syst. 23(7), 1013–1027 (2012)

  43. Yuan, M., Lin, Y.: Model selection and estimation in regression with grouped variables. J. R. Stat. Soc. Ser. B (Stat. Methodol.) 68(1), 49–67 (2006)

  44. Zhang, T.: Adaptive forward-backward greedy algorithm for learning sparse representations. IEEE Trans. Inf. Theory 57(7), 4689–4708 (2011)

  45. Zhang, T.: Sparse recovery with orthogonal matching pursuit under RIP. IEEE Trans. Inf. Theory 57(9), 6215–6221 (2011)

  46. Zhang, X., Zhang, X.: A new proximal iterative hard thresholding method with extrapolation for \(\ell _0\) minimization. J. Sci. Comput. 79(2), 809–826 (2019)

  47. Zhao, Z., Xu, F., Wang, M., Zhang, C.: A sparse enhanced indexation model with \(\ell _{1/2}\) norm and its alternating quadratic penalty method. J. Oper. Res. Soc. 70(3), 433–445 (2019)

Author information

Corresponding author

Correspondence to Kai Zhang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This work is supported by National Natural Science Foundation of China (12222112, 12071306, 32170655), Project of Educational Commission of Guangdong Province (2023ZDZX1017), Shenzhen Science and Technology Program (RCJC20221008092753082, RCYX20231211090222026), Research Team Cultivation Program of Shenzhen University (2023QNT011), Guangdong Basic and Applied Basic Research Foundation (2023A1515012395) and Research Grants Council of Hong Kong (PolyU 15217520).

About this article

Cite this article

Hu, X., Hu, Y., Yang, X. et al. Constrained Mix Sparse Optimization via Hard Thresholding Pursuit. J Sci Comput 101, 55 (2024). https://doi.org/10.1007/s10915-024-02682-3
