Abstract
Although many classifier ensemble methods exist, none of them applies weighting at the class level. Random Forest, which solves problems with an ensemble of decision trees, is the basis of our proposed ensemble. In this work, we propose a weighting-based classifier ensemble method that operates at the class level. The proposed method resembles Random Forest in employing decision trees (and, alternatively, neural networks) as base classifiers, but differs from Random Forest in assigning a weight vector to each classifier, with one weight per class. The main assumption of the method is that the reliability of each classifier's predictions differs among classes. To evaluate the proposed weighting scheme, ensembles of both decision-tree and neural-network classifiers are used in the experiments. The proposed ensemble methods were tested on a very large Persian (Farsi) handwritten-digit dataset and show improvements over competing methods.
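To make the class-level weighting concrete, the following is a minimal Python sketch of the combination rule the abstract describes: each base classifier carries one weight per class, and a classifier's vote for a class counts with its weight for that class. The function name, the use of hard votes, and the way the weights are obtained (e.g. per-class validation accuracy) are illustrative assumptions, not the authors' implementation.

import numpy as np

def class_weighted_vote(predictions, class_weights):
    """Combine hard predictions using per-class classifier weights.

    predictions   : (n_classifiers,) int array of predicted class labels.
    class_weights : (n_classifiers, n_classes) array; entry [c, k] is the
                    estimated reliability of classifier c on class k.
    """
    n_classes = class_weights.shape[1]
    scores = np.zeros(n_classes)
    for c, label in enumerate(predictions):
        # A vote for class `label` counts with the weight that reflects
        # classifier c's reliability on that particular class.
        scores[label] += class_weights[c, label]
    return int(np.argmax(scores))

# Toy usage: three classifiers, ten digit classes (as in Farsi digits).
rng = np.random.default_rng(0)
weights = rng.random((3, 10))   # e.g. per-class validation accuracies
votes = np.array([3, 3, 7])     # hard predictions of the three members
print(class_weighted_vote(votes, weights))

Under this rule a minority vote can win if the dissenting classifier is much more reliable on the class it predicts, which is exactly what distinguishes class-level weighting from plain majority voting.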
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Parvin, H., Minaei, B., Alizadeh, H., Beigi, A. (2011). A Novel Classifier Ensemble Method Based on Class Weightening in Huge Dataset. In: Liu, D., Zhang, H., Polycarpou, M., Alippi, C., He, H. (eds.) Advances in Neural Networks – ISNN 2011. Lecture Notes in Computer Science, vol. 6676. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21090-7_17
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-21089-1
Online ISBN: 978-3-642-21090-7