Abstract
Bagging is a well-known randomization technique widely used to construct trained ensembles of base classifiers. The computational power and effectiveness of Neural Networks (NNs) have been demonstrated in both research and applications, and the weight connections of an NN largely determine how well the model performs. This paper analyzes the weight connections of a trained ensemble of NNs and investigates their statistical parametric distributions, presenting a framework for estimating the best-fit distribution to the weight connections. To the best of our knowledge, this is the first attempt to explore and analyze the weight-connection distribution of a trained NN ensemble. The obtained results show that the t location-scale distribution is approximately the best fit to the weights of the trained NN ensemble; in future work, we aim to use this outcome to draw weight-connection values from the approximated best-fit distribution instead of training the classifier from scratch.
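The abstract's framework, fitting candidate parametric distributions to trained weight connections and selecting the best fit by an information criterion such as AIC, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the weight data here is synthetic heavy-tailed data standing in for the flattened weights of a trained ensemble, and the candidate set is reduced to two distributions for brevity (SciPy's `stats.t` is the t location-scale family once `loc` and `scale` are fitted).

```python
# Sketch of best-fit distribution selection for NN weight connections,
# assuming synthetic heavy-tailed data in place of real trained weights.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Stand-in for the flattened weight connections of a trained NN ensemble.
weights = stats.t.rvs(df=3, loc=0.0, scale=0.1, size=5000, random_state=rng)

def aic(loglik, n_params):
    """Akaike Information Criterion (Akaike, 1974): lower is better."""
    return 2 * n_params - 2 * loglik

candidates = {
    "normal": stats.norm,         # 2 fitted parameters (loc, scale)
    "t location-scale": stats.t,  # 3 fitted parameters (df, loc, scale)
}

scores = {}
for name, dist in candidates.items():
    params = dist.fit(weights)                      # maximum-likelihood fit
    loglik = np.sum(dist.logpdf(weights, *params))  # log-likelihood at MLE
    scores[name] = aic(loglik, len(params))

best = min(scores, key=scores.get)
print(best, scores)
```

A fuller version of this selection would add more candidate families (e.g. logistic, Cauchy) and could cross-check the AIC ranking with BIC, which penalizes extra parameters more strongly.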
Copyright information
© 2018 Springer International Publishing AG
About this paper
Cite this paper
Ahmed, M.A.O. (2018). Trained Neural Networks Ensembles Weight Connections Analysis. In: Hassanien, A., Tolba, M., Elhoseny, M., Mostafa, M. (eds) The International Conference on Advanced Machine Learning Technologies and Applications (AMLTA2018). AMLTA 2018. Advances in Intelligent Systems and Computing, vol 723. Springer, Cham. https://doi.org/10.1007/978-3-319-74690-6_24
DOI: https://doi.org/10.1007/978-3-319-74690-6_24
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-74689-0
Online ISBN: 978-3-319-74690-6
eBook Packages: Engineering (R0)