Abstract
We apply a novel black-box approximation algorithm, called IBHM, to learn both the structure and the parameters of a nonlinear regression model. IBHM builds the model incrementally as a weighted sum of activation functions, each a nonlinear function of the input vector. In each iteration the error between the current model and the approximated function is analyzed, and the candidate function with the highest correlation with that observed error is selected. This function is then added to the model's set of activation functions and the process repeats. In effect, IBHM determines both the model structure and its parameter values. In this paper we briefly outline the method and present results on the NN3 benchmark set, comparing them with two state-of-the-art methods that share a similar model structure: a Multilayer Perceptron with a single hidden layer and Support Vector Regression.
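The incremental loop described above (fit a residual, pick the candidate activation most correlated with it, add it to the model, repeat) can be sketched as follows. This is a minimal illustration based only on the abstract's description, not the authors' implementation: the candidate pool, the least-squares weight refit, and all names here (`ibhm_sketch`, `candidates`, `n_iters`) are assumptions.

```python
import numpy as np

def ibhm_sketch(X, y, candidates, n_iters=5):
    """Greedy incremental model construction (illustrative sketch only).

    At each step, pick the candidate activation function whose output is
    most correlated with the current residual error, append it to the
    model, and refit all output weights by least squares.
    """
    n = X.shape[0]
    residual = y.copy()
    basis = []                   # activation functions selected so far
    Phi = np.ones((n, 1))        # design matrix, starting with a bias column
    w = np.zeros(1)
    for _ in range(n_iters):
        # evaluate every candidate activation on the inputs
        outputs = [g(X) for g in candidates]
        # absolute Pearson correlation of each output with the residual
        corrs = [abs(np.corrcoef(o, residual)[0, 1]) for o in outputs]
        best = int(np.argmax(corrs))
        basis.append(candidates[best])
        Phi = np.column_stack([Phi, outputs[best]])
        # refit the weighted sum over all selected activations
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        residual = y - Phi @ w
    return basis, w
```

Because the weights are refit jointly after every addition, the residual becomes (approximately) orthogonal to all previously selected activations, so the correlation criterion naturally steers each iteration toward a new basis function.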
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Zawistowski, P., Arabas, J. (2011). Benchmarking IBHM Method Using NN3 Competition Dataset. In: Corchado, E., Kurzyński, M., Woźniak, M. (eds) Hybrid Artificial Intelligent Systems. HAIS 2011. Lecture Notes in Computer Science(), vol 6678. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21219-2_34
Print ISBN: 978-3-642-21218-5
Online ISBN: 978-3-642-21219-2