Abstract
This paper proposes a constrained learning algorithm for function approximation. The algorithm encodes a priori information about the approximated function as constraints in a single-hidden-layer feedforward neural network: the activation functions of the hidden neurons are polynomial functions derived from Taylor series expansions, and the constraints on the connection weights are obtained from second-order derivative information of the approximated function. Experimental results show that the new algorithm achieves better generalization performance than traditional learning algorithms.
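The idea in the abstract can be sketched roughly as follows. This is an illustrative assumption on our part, not the paper's exact formulation: we take the hidden-neuron activations to be Taylor-basis polynomials, so each output weight plays the role of a derivative of the target function at 0, and we impose the second-order derivative information as a simple interval constraint on that weight via projection after each gradient step.

```python
import math
import numpy as np

# Illustrative sketch only: the Taylor-basis "activations" and the
# projected-gradient constraint are assumptions for demonstration,
# not the paper's exact algorithm.  Target: f(x) = sin(x) on [-1, 1].
x = np.linspace(-1.0, 1.0, 200)
y = np.sin(x)

degree = 5
# Hidden-neuron outputs as Taylor-basis polynomials phi_j(x) = x^j / j!,
# so the output weight w_j plays the role of f^(j)(0).
Phi = np.stack([x**j / math.factorial(j) for j in range(degree + 1)], axis=1)

# A priori second-order derivative information: for f = sin, f'' = -sin,
# hence f''(0) must lie in [-sin(1), sin(1)] on the domain [-1, 1].
lo, hi = -math.sin(1.0), math.sin(1.0)

# Constrained training: gradient descent on mean-squared error,
# projecting w_2 back into the known interval after every step.
w = np.zeros(degree + 1)
lr = 0.5
for _ in range(5000):
    grad = Phi.T @ (Phi @ w - y) / len(x)  # gradient of the MSE
    w -= lr * grad
    w[2] = np.clip(w[2], lo, hi)           # enforce the weight constraint

mse = float(np.mean((Phi @ w - y) ** 2))
```

After training, `Phi @ w` closely tracks `sin(x)` while `w[2]` (the model's estimate of f''(0)) remains inside its admissible interval; this is the sense in which second-order derivative information constrains the connection weights.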
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Ling, QH., Han, F. (2009). A Constrained Approximation Algorithm by Encoding Second-Order Derivative Information into Feedforward Neural Networks. In: Huang, DS., Jo, KH., Lee, HH., Kang, HJ., Bevilacqua, V. (eds) Emerging Intelligent Computing Technology and Applications. With Aspects of Artificial Intelligence. ICIC 2009. Lecture Notes in Computer Science, vol 5755. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04020-7_100
DOI: https://doi.org/10.1007/978-3-642-04020-7_100
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-04019-1
Online ISBN: 978-3-642-04020-7
eBook Packages: Computer Science (R0)