Abstract
Classical regularization-based algorithms usually solve sparse optimization problems within a single-objective framework that combines the sparsity term with the loss term. Most of these algorithms suffer from the need to set or estimate the regularization parameter. To overcome this weakness, an extension of the multiobjective evolutionary algorithm based on decomposition (MOEA/D) has been studied for sparse optimization. The major advantages of MOEA/D lie in two aspects: (1) no regularization parameter needs to be set, and (2) the true sparsity level can be detected. However, owing to the generational mode of MOEA/D, its efficiency in searching the knee region of the Pareto front is unsatisfactory. In this paper, we propose a new steady-state MOEA/D with a preference for searching the region of the Pareto front near the true sparse solution. Within each iteration of the proposed algorithm, a local search step examines a number of solutions with similar sparsity levels in a neighborhood. Our experimental results show that the new MOEA/D clearly outperforms its previous version in reconstructing artificial sparse signals.
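The biobjective view described above can be sketched as follows: instead of minimizing a regularized sum, the sparsity of a candidate signal and its data-fidelity residual are treated as two separate objectives, which MOEA/D then scalarizes per subproblem (here with the weighted Tchebycheff function). This is a minimal illustration under assumed names; it is not the paper's actual algorithm, which additionally uses a steady-state update and a sparsity-level local search.

```python
# Sketch of the biobjective formulation behind MOEA/D-style sparse
# optimization: minimize f1(x) = ||x||_0 (sparsity) and f2(x) = ||Ax - b||_2^2
# (data fidelity) jointly, with no regularization parameter.
# All function names below are illustrative, not taken from the paper.

def sparsity(x, tol=1e-8):
    """f1: number of (numerically) nonzero entries of x."""
    return sum(1 for xi in x if abs(xi) > tol)

def residual(A, x, b):
    """f2: squared l2 norm of Ax - b, computed with plain Python lists."""
    r = [sum(aij * xj for aij, xj in zip(row, x)) - bi
         for row, bi in zip(A, b)]
    return sum(ri * ri for ri in r)

def tchebycheff(f, lam, z_star):
    """Weighted Tchebycheff scalarization g(x | lam, z*) used by MOEA/D."""
    return max(l * abs(fi - zi) for l, fi, zi in zip(lam, f, z_star))

# Tiny example: two measurements of a length-3 signal whose true support
# has a single nonzero entry (x_true = [0, 0, 1]).
A = [[1.0, 0.0, 2.0],
     [0.0, 1.0, 1.0]]
b = [2.0, 1.0]
x_sparse = [0.0, 0.0, 1.0]   # fits the data with sparsity 1
x_dense  = [2.0, 1.0, 0.0]   # also fits the data, but with sparsity 2

f_sparse = (sparsity(x_sparse), residual(A, x_sparse, b))
f_dense  = (sparsity(x_dense),  residual(A, x_dense,  b))
# Both solutions have zero residual; only the sparsity objective separates
# them, which is why steering search toward the true sparsity level matters.
print(f_sparse, f_dense)  # → (1, 0.0) (2, 0.0)
```

Because the underdetermined system admits many exact fits, the fidelity objective alone cannot distinguish them; the Pareto front over (sparsity, residual) makes the trade-off explicit, and the knee region near the true sparsity level is where the proposed algorithm concentrates its search.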
References
Donoho, D.: Compressed sensing. IEEE Trans. Inf. Theory 52(4), 1289–1306 (2006)
Natarajan, B.K.: Sparse approximate solutions to linear systems. SIAM J. Comput. 24(2), 227–234 (1995)
Davis, G., Mallat, S., Avellaneda, M.: Adaptive greedy approximations. Constr. Approx. 13(1), 57–98 (1997)
Temlyakov, V.: The best m-term approximation and greedy algorithms. Adv. Comput. Math. 8(3), 249–265 (1998)
Zhang, Q., Li, H.: MOEA/D: a multiobjective evolutionary algorithm based on decomposition. IEEE Trans. Evol. Comput. 11(6), 712–731 (2007)
Blumensath, T., Davies, M.: Normalized iterative hard thresholding: guaranteed stability and performance. IEEE J. Sel. Top. Signal Process. 4(2), 298–309 (2010)
Donoho, D.: De-noising by soft-thresholding. IEEE Trans. Inf. Theory 41(3), 613–627 (1995)
Xu, Z., Chang, X.Y., Xu, F., Zhang, H.: L1/2 regularization: a thresholding representation theory and a fast solver. IEEE Trans. Neural Netw. Learn. Syst. 23(7), 1013–1027 (2012)
Zeng, J., Lin, S., Wang, Y., Xu, Z.: L1/2 regularization: convergence of iterative half thresholding algorithm. IEEE Trans. Signal Process. 62(9), 2317–2329 (2014)
Li, L., Yao, X., Stolkin, R., Gong, M., He, S.: An evolutionary multiobjective approach to sparse reconstruction. IEEE Trans. Evol. Comput. 18(6), 827–845 (2014)
Li, H., Su, X., Xu, Z., Zhang, Q.: MOEA/D with iterative thresholding algorithm to sparse optimization problems. In: Proceedings of 12th International Conference on Parallel Problem Solving from Nature (PPSN), pp. 93–101 (2012)
Acknowledgments
The authors would like to thank the anonymous reviewers for their constructive comments and suggestions on the original manuscript. This work was supported by the National Science Foundation of China under Grants 61573279, 61175063, 61473241, 11626252, and 11690011, and by the National Basic Research Program of China under Grant 2017CB329404.
Copyright information
© 2018 Springer International Publishing AG
Cite this paper
Li, H., Sun, J., Fan, Y., Wang, M., Zhang, Q. (2018). A New Steady-State MOEA/D for Sparse Optimization. In: Chao, F., Schockaert, S., Zhang, Q. (eds) Advances in Computational Intelligence Systems. UKCI 2017. Advances in Intelligent Systems and Computing, vol 650. Springer, Cham. https://doi.org/10.1007/978-3-319-66939-7_10
DOI: https://doi.org/10.1007/978-3-319-66939-7_10
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-66938-0
Online ISBN: 978-3-319-66939-7
eBook Packages: Engineering (R0)