Abstract
The random neural networks (RNNs) prediction model is built with a specific randomized algorithm using a single-hidden-layer structure. Because the input weights and biases are randomly assigned and the output weights are calculated analytically, it is widely used in different applications. Most RNN-based soft measuring models set the scope of the random parameters to the default range [−1, 1]. However, this cannot ensure the universal approximation capability of the resulting model. In this paper, a selective ensemble (SEN)-RNN algorithm based on an adaptive selection scope of the input weights and biases is proposed to construct a soft measuring model. Bootstrap sampling and a genetic algorithm optimization toolbox are used to construct a set of SEN-RNN models with different random parameter scopes. The final soft measuring model is selected adaptively as the SEN model with the best generalization performance. Simulation results on the housing benchmark dataset from UCI and a dioxin concentration dataset from municipal solid waste incineration validate the proposed approach.
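To make the abstract concrete, the following is a minimal sketch of the idea, not the authors' implementation: it assumes a NumPy-based single-hidden-layer RNN with a sigmoid activation and least-squares output weights, bootstrap resampling for the ensemble members, and a simple grid over candidate scopes standing in for the genetic algorithm optimization toolbox and the SEN weighting used in the paper. Function names (train_rnn, sen_rnn) and the candidate scope values are illustrative assumptions.

    # Sketch: selective-ensemble RNN with an adaptive input-weight/bias scope.
    # Simplifications (not from the paper): sigmoid hidden layer, simple averaging
    # of ensemble members, and a grid search over scopes instead of a GA.
    import numpy as np

    def train_rnn(X, y, n_hidden, scope, rng):
        """Single-hidden-layer RNN: random input weights and biases drawn from
        [-scope, scope]; output weights computed analytically by least squares."""
        W = rng.uniform(-scope, scope, size=(X.shape[1], n_hidden))
        b = rng.uniform(-scope, scope, size=n_hidden)
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))        # hidden-layer output matrix
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # analytic output weights
        return W, b, beta

    def predict(model, X):
        W, b, beta = model
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
        return H @ beta

    def sen_rnn(X_tr, y_tr, X_val, y_val, scopes=(0.1, 0.5, 1.0, 5.0),
                n_models=20, n_hidden=30, seed=0):
        """For each candidate scope, build an ensemble of RNNs on bootstrap
        resamples and keep the scope whose ensemble generalizes best on the
        validation set."""
        rng = np.random.default_rng(seed)
        best = None
        for s in scopes:
            models = []
            for _ in range(n_models):
                idx = rng.integers(0, len(X_tr), len(X_tr))  # bootstrap resample
                models.append(train_rnn(X_tr[idx], y_tr[idx], n_hidden, s, rng))
            pred = np.mean([predict(m, X_val) for m in models], axis=0)
            rmse = np.sqrt(np.mean((pred - y_val) ** 2))
            if best is None or rmse < best[0]:
                best = (rmse, s, models)
        return best  # (validation RMSE, selected scope, ensemble members)

In the paper, the selection among candidate ensembles is driven by a GA-based optimization rather than the grid shown here, and the ensemble members are combined with learned weights rather than a plain average; the sketch only illustrates the overall structure of adaptively choosing the random parameter scope.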
Acknowledgment
This work is partially supported by the National Natural Science Foundation of China (61640308, 61573364, 61503066, and 61573249), the State Key Laboratory of Process Automation in Mining & Metallurgy, and the Beijing Key Laboratory of Process Automation in Mining & Metallurgy.
Copyright information
© 2017 Springer International Publishing AG
About this paper
Cite this paper
Tang, J., Qiao, J., Wu, Z., Zhang, J., Yan, A. (2017). Selective Ensemble Random Neural Networks Based on Adaptive Selection Scope of Input Weights and Biases for Building Soft Measuring Model. In: Liu, D., Xie, S., Li, Y., Zhao, D., El-Alfy, E.S. (eds) Neural Information Processing. ICONIP 2017. Lecture Notes in Computer Science, vol 10634. Springer, Cham. https://doi.org/10.1007/978-3-319-70087-8_60
DOI: https://doi.org/10.1007/978-3-319-70087-8_60
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-70086-1
Online ISBN: 978-3-319-70087-8