Abstract
This paper presents a new, fast, deterministic model selection methodology for incremental radial basis function neural network (RBFNN) construction in time series prediction problems. The methodology is motivated by the drawbacks of K-fold cross-validation-based model selection in this setting: its random nature and the subjective choice of a suitable value of K, which leads to large bias for low values of K and to high variance and computational cost for high values. To address these drawbacks, the proposed approach combines two balanced and representative training and validation sets, which are used for RBFNN initialization, optimization and network model evaluation. In this way, prediction accuracy is improved with small variance and bias, the time spent selecting the model is reduced, and random, computationally expensive model selection procedures based on K-fold cross-validation are avoided.
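As an illustration of the general idea described in the abstract, the following Python snippet is a minimal, hypothetical sketch (not the authors' algorithm): it grows an RBF network one hidden neuron at a time and chooses the network size using a single, fixed training/validation split instead of K-fold cross-validation. The residual-based center placement, the fixed Gaussian width, the chronological 70/30 split and the logistic-map toy data are all assumptions made for illustration; in the paper, the two sets are constructed to be balanced and representative.

```python
import numpy as np


def rbf_design_matrix(X, centers, width):
    # Gaussian activation of every center for every input sample.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))


def fit_output_weights(X, y, centers, width):
    # Output-layer weights by linear least squares on the training set.
    Phi = rbf_design_matrix(X, centers, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w


def mse(X, y, centers, width, w):
    return float(np.mean((rbf_design_matrix(X, centers, width) @ w - y) ** 2))


def incremental_selection(X_tr, y_tr, X_val, y_val, max_neurons=20, width=0.5):
    """Grow the network one hidden neuron at a time and keep the size that
    minimizes the error on the single, fixed validation set."""
    best = None
    centers = np.empty((0, X_tr.shape[1]))
    residual = y_tr.copy()
    for _ in range(max_neurons):
        # New center at the training sample with the largest residual
        # (a deterministic placement heuristic assumed for this sketch).
        idx = int(np.argmax(np.abs(residual)))
        centers = np.vstack([centers, X_tr[idx]])
        w = fit_output_weights(X_tr, y_tr, centers, width)
        residual = y_tr - rbf_design_matrix(X_tr, centers, width) @ w
        val_err = mse(X_val, y_val, centers, width, w)
        if best is None or val_err < best[0]:
            best = (val_err, centers.copy(), w.copy())
    return best  # (validation MSE, selected centers, output weights)


if __name__ == "__main__":
    # Toy data: one-step-ahead prediction of the chaotic logistic map.
    x = np.empty(500)
    x[0] = 0.3
    for t in range(499):
        x[t + 1] = 4.0 * x[t] * (1.0 - x[t])
    X = np.column_stack([x[:-2], x[1:-1]])  # inputs [x(t-1), x(t)]
    y = x[2:]                               # target  x(t+1)
    n_tr = int(0.7 * len(y))                # fixed chronological 70/30 split
    val_err, centers, w = incremental_selection(X[:n_tr], y[:n_tr],
                                                X[n_tr:], y[n_tr:])
    print(f"selected {len(centers)} hidden neurons, validation MSE = {val_err:.2e}")
```

Because the split is fixed, repeated runs return the same selected model, which is the deterministic behaviour the methodology aims for; the trade-off is that the result depends on how representative the single validation set is.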



Acknowledgments
This work was supported in part by the Spanish project TIN2007-60587 and FPU research grant AP2007-03009. The authors also thank the anonymous reviewers for their suggestions and comments.
Cite this article
Florido, J.P., Pomares, H., Rojas, I. et al. A deterministic model selection scheme for incremental RBFNN construction in time series forecasting. Neural Comput & Applic 21, 595–610 (2012). https://doi.org/10.1007/s00521-010-0466-5