Neurodynamic approaches with derivative feedback for sparse signal reconstruction

  • Original Article
  • Published:
Neural Computing and Applications

Abstract

This paper addresses the sparse signal reconstruction problem by developing centralized and distributed neurodynamic approaches. Sparse signals can be reconstructed by solving an \( l_{1} \)-minimization problem, and a centralized neurodynamic approach with derivative feedback is first designed for this purpose. Because distributed approaches decompose a large-scale problem into small-scale subproblems without the central processing node required by centralized ones, they effectively reduce the computational burden on any single node. Using distributed consensus and graph theory, the original \( l_{1} \)-minimization problem is equivalently transformed into a distributed optimization problem; building on the proposed centralized approach, a distributed neurodynamic approach with derivative feedback is then developed. Through convex optimization theory and the Lyapunov method, we show that the optimal solution of the \( l_{1} \)-minimization problem coincides with the equilibrium point of the centralized or distributed approach, and that each neurodynamic approach converges globally to its equilibrium point. Furthermore, simulation results on reconstructing sparse signals and images demonstrate the effectiveness and superiority of the proposed approaches in comparison with several state-of-the-art neurodynamic approaches.
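To make the underlying \( l_{1} \)-minimization problem concrete, the following is a minimal sketch of sparse recovery on synthetic data. It is not the authors' derivative-feedback neurodynamic network; it uses the classical iterative shrinkage-thresholding algorithm (ISTA) on the lasso relaxation \( \min_x \tfrac{1}{2}\|Ax-b\|_2^2 + \lambda\|x\|_1 \), with problem sizes and the regularization weight chosen arbitrarily for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 60, 128, 5                      # measurements, signal length, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)
b = A @ x_true                            # noiseless compressed measurements

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam=1e-3, iters=5000):
    """Proximal-gradient (ISTA) iteration for the lasso relaxation."""
    L = np.linalg.norm(A, 2) ** 2         # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - grad / L, lam / L)
    return x

x_hat = ista(A, b)
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(rel_err)                            # small relative reconstruction error
```

The continuous-time neurodynamic approaches studied in the paper can be viewed as ODE counterparts of such proximal iterations, with derivative feedback added to the dynamics to improve convergence.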


[Figures 1–10 are available in the full-text article.]


Data Availability

All data generated or analyzed during this study are included in this published article.


Acknowledgements

This work is supported in part by the National Key R&D Program of China (No. 2018AAA0100101), in part by the National Natural Science Foundation of China (Grant No. 61932006), and in part by the Chongqing Technology Innovation and Application Development Project (Grant No. cstc2020jscx-msxmX0156).

Author information

Corresponding authors

Correspondence to Hongying Zheng or Xiaofeng Liao.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article


Cite this article

Zhou, X., Zhao, Y., Zheng, H. et al. Neurodynamic approaches with derivative feedback for sparse signal reconstruction. Neural Comput & Applic 35, 9501–9515 (2023). https://doi.org/10.1007/s00521-022-08166-5

