Abstract
Interval data offer a valuable way of representing the available information in complex problems where uncertainty, inaccuracy, or variability must be taken into account. This paper considers the learning of interval neural networks, whose inputs and outputs are vectors with interval components and whose weights are real numbers. The back-propagation (BP) learning algorithm is very slow for interval neural networks, just as it is for usual real-valued neural networks. The extreme learning machine (ELM) learns much faster than the BP algorithm. In this paper, ELM is applied to the learning of interval neural networks, resulting in an interval extreme learning machine (IELM). ELM for usual feedforward neural networks consists of two steps: the first randomly generates the weights connecting the input and hidden layers, and the second uses the Moore–Penrose generalized inverse to determine the weights connecting the hidden and output layers. The first step carries over directly to interval neural networks, but the second does not, because IELM involves nonlinear constraint conditions. Instead, following the idea of the BP algorithm, we formulate a nonlinear optimization problem to determine the weights connecting the hidden and output layers of IELM. Numerical experiments show that IELM is much faster than the usual BP algorithm, and its generalization performance is much better than that of BP, while its training error is slightly worse than that of BP, suggesting possible over-fitting in BP.
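For readers unfamiliar with the standard ELM procedure summarized above, the following is a minimal sketch of its two steps for a usual real-valued single-hidden-layer feedforward network, assuming a sigmoid activation; the function names and parameter choices are illustrative and not taken from the paper. The interval version (IELM) keeps the first step but replaces the pseudoinverse step with the nonlinear optimization described above.

```
import numpy as np

def elm_train(X, T, n_hidden, seed=None):
    """Two-step ELM training for a single-hidden-layer feedforward network.

    X : (n_samples, n_inputs) real-valued training inputs
    T : (n_samples, n_outputs) training targets
    """
    rng = np.random.default_rng(seed)
    n_inputs = X.shape[1]

    # Step 1: randomly generate the weights (and biases) connecting the
    # input and hidden layers; these are fixed and never trained.
    W = rng.uniform(-1.0, 1.0, size=(n_inputs, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)

    # Hidden-layer output matrix H (sigmoid activation).
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))

    # Step 2: determine the hidden-to-output weights via the
    # Moore-Penrose generalized inverse of H (least-squares solution).
    beta = np.linalg.pinv(H) @ T
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy usage (illustrative): fit y = sin(x) from sampled data.
X = np.random.default_rng(0).uniform(-3.0, 3.0, size=(200, 1))
T = np.sin(X)
W, b, beta = elm_train(X, T, n_hidden=30, seed=1)
print("training MSE:", np.mean((elm_predict(X, W, b, beta) - T) ** 2))
```

Because the hidden-layer weights are fixed once drawn, training reduces to a single linear least-squares solve, which is why ELM is so much faster than iterative BP training. For IELM this linear solve is not available, and the output weights are instead obtained by the nonlinear optimization mentioned above.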
Acknowledgments
This work is supported by the National Natural Science Foundation of China (11171367) and the Fundamental Research Funds for the Central Universities of China.
Cite this article
Yang, D., Li, Z. & Wu, W. Extreme learning machine for interval neural networks. Neural Comput & Applic 27, 3–8 (2016). https://doi.org/10.1007/s00521-013-1519-3