Abstract
In classification problems with ordinal monotonic constraints, the class variable should increase with respect to a subset of the explanatory variables. Models produced by standard classifiers are not guaranteed to satisfy these monotonicity constraints, so specialized algorithms have been designed for such problems. In the particular case of decision trees, the growing and pruning mechanisms have been modified to produce monotonic trees. Recently, ensembles have also been adapted to this setting, providing a good trade-off between accuracy and degree of monotonicity. In this paper we study the behaviour of these decision tree mechanisms built on an AdaBoost scheme, combined with a simple ensemble pruning method based on the degree of monotonicity. An exhaustive experimental analysis shows that the resulting AdaBoost achieves better predictive performance than standard algorithms while also preserving the monotonicity restriction.
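The pruning idea in the abstract can be illustrated with a small sketch. The code below is a hypothetical, simplified version: it scores each base classifier by a pairwise non-monotonicity index on a validation set (the fraction of comparable example pairs whose predictions violate the ordering) and keeps the members with the lowest scores. The function names and the exact metric are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def non_monotonicity_index(X, y_pred):
    """Fraction of comparable pairs (X[i] <= X[j] componentwise, i != j)
    whose predicted labels violate y_pred[i] <= y_pred[j].
    Simplified O(n^2) illustration of a degree-of-monotonicity measure."""
    n = len(X)
    comparable = violations = 0
    for i in range(n):
        for j in range(n):
            if i != j and np.all(X[i] <= X[j]):
                comparable += 1
                if y_pred[i] > y_pred[j]:
                    violations += 1
    return violations / comparable if comparable else 0.0

def prune_by_monotonicity(members, X_val, keep):
    """Retain the `keep` ensemble members whose validation predictions
    have the lowest non-monotonicity index (i.e., are most monotone)."""
    ranked = sorted(members,
                    key=lambda m: non_monotonicity_index(X_val, m.predict(X_val)))
    return ranked[:keep]
```

In an AdaBoost setting, `members` would be the boosted base trees; after pruning, the retained trees vote (with their boosting weights) as usual, trading a small amount of training fit for a more monotone aggregate model.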
Acknowledgments
This work was partially supported by the Spanish Ministry of Science and Technology under project TIN2014-57251-P and the Andalusian Research Plans P11-TIC-7765, P10-TIC-6858.
Copyright information
© 2016 Springer International Publishing Switzerland
Cite this paper
González, S., Herrera, F., García, S. (2016). Managing Monotonicity in Classification by a Pruned AdaBoost. In: Martínez-Álvarez, F., Troncoso, A., Quintián, H., Corchado, E. (eds) Hybrid Artificial Intelligent Systems. HAIS 2016. Lecture Notes in Computer Science(), vol 9648. Springer, Cham. https://doi.org/10.1007/978-3-319-32034-2_43
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-32033-5
Online ISBN: 978-3-319-32034-2
eBook Packages: Computer Science, Computer Science (R0)