
Integrated Analytic Framework for Neural Network Construction

  • Conference paper
Advances in Neural Networks – ISNN 2007 (ISNN 2007)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 4492)


Abstract

This paper investigates the construction of a wide class of single-hidden-layer neural networks (SLNNs), with or without tunable parameters in the hidden nodes. Construction is a challenging problem when parameter training and determination of the network size must be considered simultaneously. Two alternative construction methods are considered. The first, discrete construction, selects a subset of hidden nodes from a pool of candidates whose parameters are fixed a priori; it is termed discrete because no parameters in the hidden nodes need to be trained. The second, continuous construction, trains all adjustable network parameters over the whole parameter space as the network is built; no candidate pool is needed, and the network grows node by node with the adjustable parameters optimized. The main contribution of this paper is to show that network construction can be carried out by either approach, and that the two approaches can be integrated within a unified analytic framework, leading to potentially significant improvements in model performance and/or computational efficiency.
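The discrete construction described above can be illustrated with a minimal sketch: hidden nodes with randomly fixed parameters form a candidate pool, and nodes are greedily selected one at a time according to how much they reduce the residual error, with the output weights solved by linear least squares. The tanh node type, pool size, selection criterion, and toy data below are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(3x) plus noise
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.standard_normal(200)

# Candidate pool: hidden nodes with parameters fixed a priori (random here)
n_candidates = 50
W = rng.standard_normal((1, n_candidates))
b = rng.standard_normal(n_candidates)
H = np.tanh(X @ W + b)  # (200, n_candidates) candidate node outputs

# Greedy forward selection: at each step, add the candidate node that
# gives the smallest sum-of-squared-errors after refitting output weights
selected = []
for _ in range(8):  # grow the network to 8 hidden nodes
    best, best_err = None, np.inf
    for j in range(n_candidates):
        if j in selected:
            continue
        cols = H[:, selected + [j]]
        beta, *_ = np.linalg.lstsq(cols, y, rcond=None)
        err = np.sum((y - cols @ beta) ** 2)
        if err < best_err:
            best, best_err = j, err
    selected.append(best)

# Final output weights for the selected subnetwork
cols = H[:, selected]
beta, *_ = np.linalg.lstsq(cols, y, rcond=None)
sse = np.sum((y - cols @ beta) ** 2)
print(len(selected), float(sse))
```

Continuous construction would instead optimize each new node's parameters (here `W`, `b`) by nonlinear search as it is added, rather than drawing them from a fixed pool.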




Editor information

Derong Liu, Shumin Fei, Zengguang Hou, Huaguang Zhang, Changyin Sun


Copyright information

© 2007 Springer Berlin Heidelberg

About this paper

Cite this paper

Li, K., Peng, JX., Fei, M., Li, X., Yu, W. (2007). Integrated Analytic Framework for Neural Network Construction. In: Liu, D., Fei, S., Hou, Z., Zhang, H., Sun, C. (eds) Advances in Neural Networks – ISNN 2007. ISNN 2007. Lecture Notes in Computer Science, vol 4492. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72393-6_58


  • DOI: https://doi.org/10.1007/978-3-540-72393-6_58

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-72392-9

  • Online ISBN: 978-3-540-72393-6

  • eBook Packages: Computer Science, Computer Science (R0)
