
A L-BFGS Based Learning Algorithm for Complex-Valued Feedforward Neural Networks

Published in Neural Processing Letters.

Abstract

In this paper, a new learning algorithm is proposed for complex-valued feedforward neural networks (CVFNNs). The basic idea of this algorithm is that the descent directions of the cost function with respect to the complex-valued parameters are computed by the limited-memory BFGS (L-BFGS) algorithm, and the learning step size is determined by the Armijo line-search method. Since the approximation of the Hessian matrix is built from the information of only the latest few iterations, memory efficiency is improved. To avoid the saturated regions of the activation functions, gain parameters are adjusted together with the weights and biases. Compared with some existing learning algorithms for CVFNNs, the developed algorithm converges faster and reaches deeper minima of the cost function. In addition, the effects of the initial values of the weights and biases on the efficiency and convergence speed of the learning algorithm are analyzed. The performance of the proposed algorithm is evaluated against several existing classifiers on a variety of benchmark classification problems. Experimental results show that our algorithm achieves better performance with a relatively compact network structure.
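The abstract combines two standard building blocks: the L-BFGS two-loop recursion for the descent direction and Armijo backtracking for the step size. As a rough illustration only (not the authors' exact update rule, which additionally adapts activation-gain parameters), the sketch below applies a generic real-valued L-BFGS with Armijo search to a toy complex least-squares problem by stacking real and imaginary parts (the "split" parameterization); all function names and the test problem are our own assumptions.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: approximate -H^{-1} grad from the last m (s, y) pairs."""
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):   # newest pair first
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if s_list:  # scale by gamma = s^T y / y^T y as the initial Hessian guess
        gamma = (s_list[-1] @ y_list[-1]) / (y_list[-1] @ y_list[-1])
        q *= gamma
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):  # oldest first
        rho = 1.0 / (y @ s)
        beta = rho * (y @ q)
        q += (a - beta) * s
    return -q

def armijo(f, x, d, g, shrink=0.5, c=1e-4, max_back=30):
    """Backtracking: largest step t = shrink^k satisfying the Armijo condition."""
    t, fx, slope = 1.0, f(x), g @ d
    for _ in range(max_back):
        if f(x + t * d) <= fx + c * t * slope:
            return t
        t *= shrink
    return t

def minimize_lbfgs(f, grad, x0, m=5, iters=100, tol=1e-8):
    x, g = x0.copy(), grad(x0)
    s_list, y_list = [], []
    for _ in range(iters):
        d = lbfgs_direction(g, s_list, y_list)
        t = armijo(f, x, d, g)
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 1e-12:          # curvature condition; skip the update otherwise
            s_list.append(s); y_list.append(y)
            if len(s_list) > m:    # keep only the latest m pairs (limited memory)
                s_list.pop(0); y_list.pop(0)
        x, g = x_new, g_new
        if np.linalg.norm(g) < tol:
            break
    return x

# Toy complex least-squares: min_w ||A w - b||^2 with w stored as [Re(w); Im(w)].
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 4)) + 1j * rng.standard_normal((8, 4))
b = rng.standard_normal(8) + 1j * rng.standard_normal(8)

def split(w_r):                    # real vector [Re; Im] -> complex vector
    n = w_r.size // 2
    return w_r[:n] + 1j * w_r[n:]

def f(w_r):
    r = A @ split(w_r) - b
    return float(np.real(r.conj() @ r))

def grad_f(w_r):                   # real gradient of ||Aw - b||^2 w.r.t. [Re; Im]
    g_c = A.conj().T @ (A @ split(w_r) - b)   # Wirtinger gradient w.r.t. conj(w)
    return 2 * np.concatenate([np.real(g_c), np.imag(g_c)])

w = minimize_lbfgs(f, grad_f, np.zeros(8))
```

On this convex toy problem the iterate converges to the least-squares solution; the paper's contribution lies in carrying this scheme over to nonconvex CVFNN training with jointly adapted gain parameters.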


Figures 1–3 (not reproduced here).



Acknowledgements

The authors would like to thank the associate editor and anonymous reviewers for their constructive comments that have greatly improved the quality of this paper. This work was jointly supported by the National Natural Science Foundation of China under Grant nos. 61273122 and 61005047, and the Qing Lan Project of Jiangsu Province. This publication was made possible by NPRP Grant: NPRP 8-274-2-107 from the Qatar National Research Fund (a member of Qatar Foundation). The statements made herein are solely the responsibility of the author[s].

Author information

Corresponding author

Correspondence to He Huang.


Cite this article

Wu, R., Huang, H., Qian, X. et al. A L-BFGS Based Learning Algorithm for Complex-Valued Feedforward Neural Networks. Neural Process Lett 47, 1271–1284 (2018). https://doi.org/10.1007/s11063-017-9692-5
