Abstract
The mix sparse structure, in which sparsity appears simultaneously in both inter-group and intra-group manners, is inherent in a wide class of practical applications. Hard thresholding pursuit (HTP) is a practical and efficient algorithm for solving least squares problems with a cardinality constraint. In this paper, we propose an HTP-based algorithm, named MixHTP, for solving a constrained mix sparse optimization problem, and establish its linear convergence under the restricted isometry property. Moreover, we apply MixHTP to compressive sensing with simulated data and to enhanced indexation with real data. Numerical results show that MixHTP performs excellently in recovering solutions with mix sparse structure and outperforms several state-of-the-art algorithms in the literature.
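To fix ideas, the classical HTP scheme that MixHTP builds on alternates a gradient step, a hard thresholding step that keeps the s largest-magnitude entries, and a least-squares debiasing step on the selected support. The sketch below is a minimal illustration of this plain (non-mix) HTP; the function name, step size, and stopping rule are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

def htp(A, y, s, max_iter=100, mu=1.0):
    """Hard thresholding pursuit for min ||y - A x||_2 s.t. ||x||_0 <= s.

    Each iteration: (1) gradient step, (2) keep the s largest-magnitude
    entries, (3) debias by least squares on the selected support.
    Stops when the support stabilizes.
    """
    n = A.shape[1]
    x = np.zeros(n)
    support = np.array([], dtype=int)
    for _ in range(max_iter):
        u = x + mu * (A.T @ (y - A @ x))            # gradient step
        new_support = np.sort(np.argsort(np.abs(u))[-s:])
        if np.array_equal(new_support, support):    # support stabilized
            break
        support = new_support
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x = np.zeros(n)
        x[support] = coef                           # debiased iterate
    return x
```

Under a restricted isometry condition on A, iterations of this type identify the true support in finitely many steps, which is the mechanism behind the linear convergence analysis.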







Data availability
The test data are available from the authors upon request.
Notes
GroupHTPC solves the group sparse optimization problem (2) under the constraint \(\mathcal {C}\); it is obtained from MixHTP by removing the thresholding operator \(\mathcal {H}(x;s)\).
HTPC solves the sparse optimization problem (1) under the constraint \(\mathcal {C}\); it is obtained from MixHTP by removing the group thresholding operator \(\mathcal {H}_G(x;S)\).
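The two operators referenced in these notes are the entrywise hard thresholding operator \(\mathcal {H}(x;s)\), which keeps the s largest-magnitude entries of x, and the group hard thresholding operator \(\mathcal {H}_G(x;S)\), which keeps the S groups with the largest Euclidean norms. A minimal sketch of both, assuming groups are given as index arrays partitioning the coordinates (the function names and group layout are illustrative assumptions):

```python
import numpy as np

def hard_threshold(x, s):
    """Entrywise operator H(x; s): keep the s largest-magnitude entries."""
    out = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-s:]
    out[keep] = x[keep]
    return out

def group_hard_threshold(x, groups, S):
    """Group operator H_G(x; S): keep the S groups with largest 2-norms.

    `groups` is a list of index arrays partitioning {0, ..., n-1}.
    """
    norms = np.array([np.linalg.norm(x[g]) for g in groups])
    keep = np.argsort(norms)[-S:]
    out = np.zeros_like(x)
    for gi in keep:
        out[groups[gi]] = x[groups[gi]]
    return out
```

Removing one of the two operators from the mix algorithm leaves only the other notion of sparsity, which is exactly how the GroupHTPC and HTPC baselines above are obtained.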
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
This work is supported by National Natural Science Foundation of China (12222112, 12071306, 32170655), Project of Educational Commission of Guangdong Province (2023ZDZX1017), Shenzhen Science and Technology Program (RCJC20221008092753082, RCYX20231211090222026), Research Team Cultivation Program of Shenzhen University (2023QNT011), Guangdong Basic and Applied Basic Research Foundation (2023A1515012395) and Research Grants Council of Hong Kong (PolyU 15217520).
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Hu, X., Hu, Y., Yang, X. et al. Constrained Mix Sparse Optimization via Hard Thresholding Pursuit. J Sci Comput 101, 55 (2024). https://doi.org/10.1007/s10915-024-02682-3
Keywords
- Hard thresholding pursuit
- Mix sparse structure
- Restricted isometry property
- Convergence property
- Enhanced indexation