Abstract
The aim of this paper is to construct a modified greedy algorithm for ill-posed function approximation problems in the presence of data noise. We provide a detailed convergence analysis of the algorithm under noise and discuss the choice of the iteration parameters. This yields a stopping rule for which the corresponding algorithm is a regularization method, with convergence rates in L² and, under weak additional assumptions, also in Sobolev spaces of positive order.
Finally, we discuss the application of the modified greedy algorithm to sigmoidal neural networks and radial basis functions, and supplement the theoretical results with numerical experiments.
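To illustrate the kind of scheme the abstract describes, the following is a minimal sketch (not the authors' actual algorithm) of a pure greedy approximation over a sigmoid dictionary, stopped early by a discrepancy-type rule that halts once the residual norm drops to the order of the noise level. The dictionary grid, the parameter `tau`, and the function `greedy_fit` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def greedy_fit(x, y, delta, tau=1.5, max_iter=200):
    """Pure greedy approximation of noisy samples y(x) by sigmoidal units.

    Stops when ||residual|| <= tau * delta (discrepancy-type stopping rule),
    where delta is the known noise level. Sketch only; parameter grid and
    tau are illustrative choices.
    """
    # Hypothetical dictionary: sigmoids sigma(w*x + b) on a parameter grid.
    ws = np.linspace(-20.0, 20.0, 41)
    bs = np.linspace(-20.0, 20.0, 41)
    D = np.array([sigmoid(w * x + b) for w in ws for b in bs])  # (atoms, samples)
    norms = np.linalg.norm(D, axis=1)
    approx = np.zeros_like(y)
    for _ in range(max_iter):
        r = y - approx
        if np.linalg.norm(r) <= tau * delta:  # stopping rule: noise level reached
            break
        corr = (D @ r) / norms                # normalized correlations with residual
        k = np.argmax(np.abs(corr))           # greedily pick the best-matching atom
        g = D[k]
        # Add the projection of the residual onto the chosen atom; this step
        # never increases the residual norm.
        approx = approx + ((g @ r) / norms[k] ** 2) * g
    return approx
```

Each iteration removes the component of the residual along the selected atom, so the residual norm is non-increasing; without the stopping rule, the iteration would eventually start fitting the noise, which is exactly the ill-posedness the paper's regularization addresses.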
Acknowledgments.
The authors thank Prof. Heinz W. Engl (University of Linz) for useful and stimulating discussions. Financial support by the Austrian Science Fund FWF through project SFB F 013/08 is gratefully acknowledged.
Burger, M., Hofinger, A. Regularized Greedy Algorithms for Network Training with Data Noise. Computing 74, 1–22 (2005). https://doi.org/10.1007/s00607-004-0081-3