A New Learning Algorithm for Function Approximation by Encoding Additional Constraints into Feedforward Neural Network

  • Conference paper
Advanced Intelligent Computing Theories and Applications. With Aspects of Contemporary Intelligent Computing Techniques (ICIC 2007)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 2)


Abstract

In this paper, a new learning algorithm that encodes additional constraints into feedforward neural networks is proposed for the function approximation problem. The algorithm incorporates two kinds of constraints, derived from a priori information about the approximation problem, into single-hidden-layer feedforward neural networks: architectural constraints and connection-weight constraints. On the one hand, the activation functions of the hidden neurons are a class of specific polynomial functions based on Taylor series expansions. On the other hand, the connection-weight constraints are obtained from the first- and second-order derivative information of the approximated function. Theoretical justifications and experimental results show that the new algorithm achieves better generalization performance than traditional learning algorithms.
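The abstract does not give the paper's exact formulation, but the idea it describes can be illustrated with a minimal sketch: a single-hidden-layer network whose hidden units use monomial (Taylor-style) activations, with a priori first- and second-derivative information about the target function encoded as soft constraints on the output weights. The constraint form below (penalizing deviation of the fitted polynomial's derivatives at x = 0 from known values) is a hypothetical stand-in for the paper's constrained learning rule, not its actual method.

```python
import numpy as np

def taylor_features(x, degree):
    """Hidden-layer outputs for a Taylor-basis network: [1, x, x^2, ..., x^degree]."""
    return np.vstack([x**k for k in range(degree + 1)]).T

def fit_constrained(x, y, degree, d1, d2, lam=1e3):
    """Least-squares fit of output weights w, with soft constraints tying
    w[1] to f'(0) and 2*w[2] to f''(0) (illustrative constraint form only)."""
    H = taylor_features(x, degree)
    # For sum_k w_k x^k, the first derivative at 0 is w[1], the second is 2*w[2].
    C = np.zeros((2, degree + 1))
    C[0, 1] = 1.0
    C[1, 2] = 2.0
    # Stack the data-fitting rows and the weighted constraint rows into one
    # augmented least-squares problem.
    A = np.vstack([H, np.sqrt(lam) * C])
    b = np.concatenate([y, np.sqrt(lam) * np.array([d1, d2])])
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

# Example: approximate sin(x) on [-1, 1], using the a priori information
# sin'(0) = 1 and sin''(0) = 0.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 40)
y = np.sin(x)
w = fit_constrained(x, y, degree=5, d1=1.0, d2=0.0)

x_test = np.linspace(-1, 1, 11)
err = np.max(np.abs(taylor_features(x_test, 5) @ w - np.sin(x_test)))
```

Because the constraints agree with the target function's true Taylor coefficients, they do not fight the data fit; the intended benefit of such a priori information is better generalization when training data are sparse or noisy.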



Editor information

De-Shuang Huang, Laurent Heutte, Marco Loog


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Han, F., Ling, Q.H. (2007). A New Learning Algorithm for Function Approximation by Encoding Additional Constraints into Feedforward Neural Network. In: Huang, D.S., Heutte, L., Loog, M. (eds) Advanced Intelligent Computing Theories and Applications. With Aspects of Contemporary Intelligent Computing Techniques. ICIC 2007. Communications in Computer and Information Science, vol 2. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74282-1_8

  • DOI: https://doi.org/10.1007/978-3-540-74282-1_8

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-74281-4

  • Online ISBN: 978-3-540-74282-1
