Abstract
This paper investigates the construction of a wide class of single hidden layer neural networks (SLNNs), with or without tunable parameters in the hidden nodes. The problem is challenging when parameter training and determination of the network size are considered simultaneously. Two alternative construction methods are considered in this paper. The first, discrete construction, selects a subset of hidden nodes from a pool of candidates whose parameters are fixed 'a priori'; it is termed discrete because no parameters in the hidden nodes need to be trained. The second, continuous construction, trains all adjustable network parameters over the whole parameter space as the network is built: no candidate pool is generated, and the network grows one node at a time with its adjustable parameters optimized. The main contribution of this paper is to show that network construction can be carried out by either approach, and that the two can be integrated within a unified analytic framework, leading to potentially significant improvements in model performance and/or computational efficiency.
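The discrete construction described above can be illustrated with a minimal sketch: draw a pool of hidden nodes with randomly fixed parameters, then greedily select the subset that best reduces the residual, solving only a linear least-squares problem for the output weights. This is an illustrative assumption-laden example (sigmoid hidden nodes, plain greedy forward selection, all function and variable names are hypothetical), not the specific algorithm of the paper.

```python
import numpy as np

def discrete_construction(X, y, pool_size=50, max_nodes=5, seed=0):
    """Greedy forward selection of hidden nodes from a fixed random pool.

    A sketch of 'discrete construction': hidden-node parameters (input
    weights W, biases b) are drawn once and never trained; only the node
    subset and the linear output weights are determined.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Candidate pool: sigmoid nodes with parameters fixed 'a priori'.
    W = rng.standard_normal((pool_size, d))
    b = rng.standard_normal(pool_size)
    H = 1.0 / (1.0 + np.exp(-(X @ W.T + b)))  # n x pool_size activations

    selected = []
    for _ in range(max_nodes):
        best_j, best_err = None, np.inf
        for j in range(pool_size):
            if j in selected:
                continue
            cols = selected + [j]
            # Output weights for this candidate subset via least squares.
            beta, *_ = np.linalg.lstsq(H[:, cols], y, rcond=None)
            err = np.sum((y - H[:, cols] @ beta) ** 2)
            if err < best_err:
                best_j, best_err = j, err
        selected.append(best_j)  # grow the network by the best candidate
    beta, *_ = np.linalg.lstsq(H[:, selected], y, rcond=None)
    return selected, beta, H

# Toy usage: approximate y = sin(x) on [-2, 2] with 5 selected nodes.
X = np.linspace(-2, 2, 80).reshape(-1, 1)
y = np.sin(X).ravel()
selected, beta, H = discrete_construction(X, y)
residual = np.sqrt(np.mean((y - H[:, selected] @ beta) ** 2))
```

In the continuous alternative, the inner loop over a fixed pool would be replaced by a nonlinear optimization of each new node's parameters (W, b) over the whole parameter space, which this sketch deliberately omits.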
© 2007 Springer Berlin Heidelberg
Cite this paper
Li, K., Peng, JX., Fei, M., Li, X., Yu, W. (2007). Integrated Analytic Framework for Neural Network Construction. In: Liu, D., Fei, S., Hou, Z., Zhang, H., Sun, C. (eds) Advances in Neural Networks – ISNN 2007. ISNN 2007. Lecture Notes in Computer Science, vol 4492. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72393-6_58
Print ISBN: 978-3-540-72392-9
Online ISBN: 978-3-540-72393-6