Abstract
In this paper, a new learning algorithm is proposed for complex-valued feedforward neural networks (CVFNNs). The basic idea of this algorithm is that the descent directions of the cost function with respect to the complex-valued parameters are computed by the limited-memory BFGS (L-BFGS) algorithm, and the learning step is determined by the Armijo line search method. Since the approximation of the Hessian matrix is built from the information of only the latest few iterations, memory efficiency is improved. To avoid the saturated regions of the activation functions, gain parameters are adjusted together with the weights and biases. Compared with some existing learning algorithms for CVFNNs, the developed algorithm converges faster and reaches a deeper minimum of the cost function. In addition, the effects of the initial values of the weights and biases on the efficiency and convergence speed of the learning algorithm are analyzed. The performance of the proposed algorithm is evaluated against several existing classifiers on a variety of benchmark classification problems. Experimental results show that better performance is achieved by our algorithm with a relatively compact network structure.
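As a rough illustration of the training step summarized above, the following minimal NumPy sketch (not taken from the paper) combines the L-BFGS two-loop recursion with an Armijo backtracking line search over a flat, real-valued parameter vector. The function names, memory size `m`, and line-search constants are illustrative assumptions; the paper itself applies these ideas to complex-valued weights, biases, and trainable activation gains.

```python
# Illustrative sketch only: real-valued L-BFGS direction + Armijo backtracking,
# not the authors' implementation (which works with complex-valued parameters).
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: approximate -H^{-1} grad from the stored (s, y) pairs."""
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):   # newest pair first
        rho = 1.0 / np.dot(y, s)
        a = rho * np.dot(s, q)
        alphas.append(a)
        q = q - a * y
    if s_list:  # scale the initial inverse-Hessian guess by gamma = s^T y / y^T y
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):  # oldest pair first
        rho = 1.0 / np.dot(y, s)
        b = rho * np.dot(y, r)
        r = r + s * (a - b)
    return -r  # descent direction

def armijo_step(f, x, grad, d, eta0=1.0, beta=0.5, sigma=1e-4, max_backtracks=30):
    """Backtracking line search enforcing the Armijo sufficient-decrease condition."""
    eta, fx = eta0, f(x)
    for _ in range(max_backtracks):
        if f(x + eta * d) <= fx + sigma * eta * np.dot(grad, d):
            break
        eta *= beta
    return eta

def train(f, grad_f, x0, m=7, iters=100):
    """Limited-memory quasi-Newton loop keeping only the latest m curvature pairs."""
    x, g = x0.copy(), grad_f(x0)
    s_list, y_list = [], []
    for _ in range(iters):
        d = lbfgs_direction(g, s_list, y_list)
        eta = armijo_step(f, x, g, d)
        x_new = x + eta * d
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g
        if np.dot(s, y) > 1e-10:         # keep the pair only if curvature is positive
            s_list.append(s)
            y_list.append(y)
            if len(s_list) > m:          # limited memory: discard the oldest pair
                s_list.pop(0)
                y_list.pop(0)
        x, g = x_new, g_new
    return x

# Example on a simple quadratic cost: f(x) = ||x||^2 has its minimum at the origin.
x_min = train(lambda x: np.dot(x, x), lambda x: 2.0 * x, np.ones(5))
```

The positive-curvature check `s^T y > 0` before storing a pair is the standard safeguard that keeps the implicit inverse-Hessian approximation positive definite in (L-)BFGS updates; it stands in here for whatever update rule the paper actually uses.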
Acknowledgements
The authors would like to thank the associate editor and anonymous reviewers for their constructive comments that have greatly improved the quality of this paper. This work was jointly supported by the National Natural Science Foundation of China under Grant nos. 61273122 and 61005047, and the Qing Lan Project of Jiangsu Province. This publication was made possible by NPRP Grant: NPRP 8-274-2-107 from the Qatar National Research Fund (a member of Qatar Foundation). The statements made herein are solely the responsibility of the author[s].
Cite this article
Wu, R., Huang, H., Qian, X. et al. A L-BFGS Based Learning Algorithm for Complex-Valued Feedforward Neural Networks. Neural Process Lett 47, 1271–1284 (2018). https://doi.org/10.1007/s11063-017-9692-5