Abstract
In supervised prediction problems, the response attribute depends on certain explanatory attributes. Some real-world problems additionally require the response attribute to represent ordinal values that should increase with some of the explanatory attributes; these are called classification problems with monotonicity constraints. In this paper, we aim at formalizing the approach to nested generalized exemplar learning with monotonicity constraints, proposing the monotonic nested generalized exemplar learning (MoNGEL) method. It accomplishes learning by storing objects in \({\mathbb {R}}^n\), hybridizing instance-based learning and rule learning into a combined model. An experimental analysis is carried out over a wide range of monotonic data sets. The results, verified by non-parametric statistical tests, show that MoNGEL outperforms well-known techniques for monotonic classification, such as the ordinal learning model, the ordinal stochastic dominance learner and the k-nearest neighbor classifier, in terms of accuracy, mean absolute error and simplicity of the constructed models.
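To make the two ingredients of the abstract concrete, the following is a minimal, hypothetical sketch (not MoNGEL's actual algorithm) of nearest-hyperrectangle classification together with a monotonicity check over the stored exemplars: a query point is labeled by the closest axis-aligned hyperrectangle, and the ordinal constraint requires that an exemplar dominating another in every attribute never carries a lower label. All function and variable names here are illustrative assumptions.

```python
def rect_distance(x, lower, upper):
    """Euclidean distance from point x to an axis-aligned hyperrectangle;
    zero when x lies inside the rectangle."""
    total = 0.0
    for xi, lo, hi in zip(x, lower, upper):
        if xi < lo:
            total += (lo - xi) ** 2
        elif xi > hi:
            total += (xi - hi) ** 2
    return total ** 0.5

def classify(x, rectangles):
    """Predict the ordinal label of the nearest hyperrectangle.
    Each rectangle is a (lower, upper, label) triple."""
    return min(rectangles, key=lambda r: rect_distance(x, r[0], r[1]))[2]

def violates_monotonicity(rectangles):
    """True if some rectangle that dominates another in every dimension
    (its lower corner is >= the other's upper corner) has a lower label,
    which would break the monotonicity constraint."""
    for lo_a, _up_a, y_a in rectangles:
        for _lo_b, up_b, y_b in rectangles:
            dominates = all(la >= ub for la, ub in zip(lo_a, up_b))
            if dominates and y_a < y_b:
                return True
    return False

# Two non-overlapping exemplars with ordinal labels 0 and 1.
rects = [((0, 0), (1, 1), 0), ((2, 2), (3, 3), 1)]
print(classify((0.5, 0.5), rects))   # inside the first rectangle -> 0
print(violates_monotonicity(rects))  # labels increase with position -> False
```

A generalized-exemplar learner would additionally grow rectangles by merging nearby same-class instances; the monotonicity check above is the kind of constraint such merges must preserve.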
Acknowledgments
The authors are very grateful to the anonymous reviewers for their valuable suggestions and comments to improve the quality of this paper.
Ethics declarations
Conflict of interest
We declare that we have no conflict of interest.
Cite this article
García, J., Fardoun, H.M., Alghazzawi, D.M. et al. MoNGEL: monotonic nested generalized exemplar learning. Pattern Anal Applic 20, 441–452 (2017). https://doi.org/10.1007/s10044-015-0506-y