Abstract
In this paper, a new learning algorithm that encodes additional constraints into feedforward neural networks is proposed for the function approximation problem. The algorithm incorporates two kinds of constraints, derived from a priori information about the approximated function, into single-hidden-layer feedforward neural networks: architectural constraints and connection weight constraints. On the one hand, the activation functions of the hidden neurons are a class of specific polynomial functions based on Taylor series expansions. On the other hand, the connection weight constraints are derived from the first-order and second-order derivative information of the approximated function. Theoretical justification and experimental results show that the new algorithm achieves better generalization performance than traditional learning algorithms.
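To make the idea concrete, below is a minimal sketch of one plausible reading of the abstract, not the authors' exact formulation: the hidden units use Taylor-style monomial activations x^j, so the network output is a polynomial in x, and the a priori first- and second-derivative information enters as extra weighted least-squares terms on the output weights. The function names (`design_matrices`, `fit_constrained`) and the penalty weights `lam1`, `lam2` are illustrative assumptions; the paper's constrained learning rule may be iterative and may constrain all connection weights, not just the output layer.

```python
import numpy as np

# Hedged sketch: single-hidden-layer network with monomial (Taylor-basis)
# activations, f_net(x) = sum_j v_j * x**j.  A priori derivative information
# about the target function is encoded as additional least-squares terms.
# This is an assumed simplification of the paper's constrained algorithm.

def design_matrices(x, degree):
    """Hidden-layer outputs and their first/second derivatives w.r.t. x."""
    j = np.arange(degree + 1)
    H = x[:, None] ** j                                  # basis: x^j
    H1 = j * x[:, None] ** np.clip(j - 1, 0, None)       # d/dx: j*x^(j-1)
    H1[:, 0] = 0.0                                       # derivative of constant
    H2 = j * (j - 1) * x[:, None] ** np.clip(j - 2, 0, None)
    H2[:, :2] = 0.0                                      # 2nd derivative of 1, x
    return H, H1, H2

def fit_constrained(x, y, dy, d2y, degree=5, lam1=1.0, lam2=1.0):
    """Solve for output weights v minimising
       ||H v - y||^2 + lam1*||H1 v - dy||^2 + lam2*||H2 v - d2y||^2."""
    H, H1, H2 = design_matrices(x, degree)
    A = np.vstack([H, np.sqrt(lam1) * H1, np.sqrt(lam2) * H2])
    b = np.concatenate([y, np.sqrt(lam1) * dy, np.sqrt(lam2) * d2y])
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v

# Example: approximate sin(x), using its known derivatives as a priori info.
x = np.linspace(-np.pi, np.pi, 40)
v = fit_constrained(x, np.sin(x), np.cos(x), -np.sin(x))
x_test = np.linspace(-np.pi, np.pi, 200)
H, _, _ = design_matrices(x_test, 5)
print("max approximation error:", np.max(np.abs(H @ v - np.sin(x_test))))
```

Because the derivative constraints are linear in the output weights here, the sketch admits a closed-form solve; it stands in for, but is not equivalent to, the iterative constrained learning procedure the paper proposes.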