Abstract
In a neural network ensemble, the diversity of the constituent component networks is a crucial factor for improving generalization performance. According to how an ensemble solves a problem, existing ensemble mechanisms can be roughly divided into two groups: data-driven and model-driven ensembles. The former introduces diversity among ensemble members by manipulating the data, while the latter achieves diversity by manipulating the component models themselves. Standard back-propagation (BP) networks are usually used as the base components of a neural network ensemble. In this article, however, we use our previously designed improved circular back-propagation (ICBP) neural network to build such an ensemble. ICBP differs from the BP network not only in that an extra anisotropic input node is added, but, more importantly, in that this extra node gives it a property the BP network lacks: simply by assigning different sets of values 1 and −1 to the weights connecting the extra node to the hidden nodes, we can construct a set of heterogeneous ICBP networks with different hidden-layer activation functions. From these we select four typical heterogeneous ICBPs to build a dynamic classifier selection ICBP system (DCS-ICBP), which falls into the category of model-driven ensembles. The aim of this article is to explore the relationship between the explicitly constructed ensemble and the diversity scale, and to verify the feasibility and effectiveness of the system on classification problems through empirical study. Experimental results on seven benchmark classification tasks show that DCS-ICBP outperforms each individual ICBP classifier and surpasses the combination of ICBPs by majority voting, i.e., the majority voting ICBP system (MVICBP). These results validate that DCS-ICBP provides a new constructive method for enforcing diversity in ICBP ensemble systems.
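To make the construction concrete, the sketch below illustrates, under stated assumptions, how a pool of heterogeneous ICBP-like classifiers could be generated from ±1 weight assignments on an extra input node, and how a dynamic classifier selection rule could pick one member per test sample. This is a minimal illustration, not the article's exact formulation: the class and function names (ICBPLikeClassifier, dcs_predict), the use of the squared input norm as the extra node's signal, and the k-nearest-neighbour local-accuracy selection criterion are all assumptions introduced for exposition.

```python
# Illustrative sketch only (assumed formulation, not the paper's exact model):
# each member is an MLP whose input is augmented with ||x||^2, and a fixed
# vector s in {+1, -1}^H feeds that extra feature into each hidden unit with
# sign +1 or -1, so different sign patterns yield heterogeneous hidden layers.
import numpy as np

class ICBPLikeClassifier:
    def __init__(self, n_in, n_hidden, n_out, sign_vector, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 0.1, (n_hidden, n_in))   # input -> hidden weights
        self.v = np.asarray(sign_vector, dtype=float)     # fixed +/-1 weights of the extra node
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.1, (n_out, n_hidden))   # hidden -> output weights
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def _forward(self, X):
        extra = np.sum(X * X, axis=1)                      # assumed extra input: squared norm
        z1 = X @ self.W1.T + np.outer(extra, self.v) + self.b1
        h = 1.0 / (1.0 + np.exp(-z1))                      # sigmoid hidden layer
        z2 = h @ self.W2.T + self.b2
        p = np.exp(z2 - z2.max(axis=1, keepdims=True))
        return h, p / p.sum(axis=1, keepdims=True)         # softmax outputs

    def fit(self, X, y, epochs=200):
        Y = np.eye(self.W2.shape[0])[y]                    # one-hot targets
        for _ in range(epochs):
            h, p = self._forward(X)
            g2 = (p - Y) / len(X)                          # softmax cross-entropy gradient
            g1 = (g2 @ self.W2) * h * (1 - h)
            self.W2 -= self.lr * g2.T @ h
            self.b2 -= self.lr * g2.sum(axis=0)
            self.W1 -= self.lr * g1.T @ X                  # v stays fixed at +/-1
            self.b1 -= self.lr * g1.sum(axis=0)

    def predict(self, X):
        return self._forward(X)[1].argmax(axis=1)

def dcs_predict(classifiers, X_val, y_val, x, k=5):
    """Dynamic classifier selection by local accuracy (an assumed rule): for a
    test sample x, choose the member most accurate on its k nearest validation
    points and return that member's prediction for x."""
    d = np.linalg.norm(X_val - x, axis=1)
    nn = np.argsort(d)[:k]
    accs = [np.mean(c.predict(X_val[nn]) == y_val[nn]) for c in classifiers]
    return classifiers[int(np.argmax(accs))].predict(x[None, :])[0]
```

Under these assumptions, four heterogeneous members would be obtained by instantiating the class with four different sign vectors (for example all +1, all −1, and two mixed patterns) and training each on the same data; majority voting over their predictions would correspond to the MVICBP baseline mentioned above, while dcs_predict corresponds to the dynamic selection strategy.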







Acknowledgments
This work was supported by the National Natural Science Foundation of China under Grant No. 60773061, the Jiangsu “QingLan” Project Foundation, the Returnee’s Foundation of the China Scholarship Council, and the Jiangsu Ph.D. Students Innovative Foundation under Grant No. BCXJ05-05.
Cite this article
Dai, Q. The build of a dynamic classifier selection ICBP system and its application to pattern recognition. Neural Comput & Applic 19, 123–137 (2010). https://doi.org/10.1007/s00521-009-0263-1