Abstract
Decomposition methodology for classification tasks breaks a complex classification task into several simpler, more manageable sub-tasks that can be solved by existing induction methods, and then joins their solutions to solve the original problem. In this paper we survey popular yet diverse decomposition methods and introduce a taxonomy for categorizing them. We then use this taxonomy to build a novel meta-decomposer framework that automatically selects an appropriate decomposition method for a given problem. An experimental study on a set of benchmark datasets validates the effectiveness of the proposed meta-decomposer.
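To make the break-down-then-join idea concrete, here is a minimal illustrative sketch (not the paper's meta-decomposer): a three-class task is decomposed by class binarization into three binary sub-tasks, each solved by a simple nearest-centroid learner, and the sub-solutions are joined by taking the highest-scoring class. All names, the toy data, and the centroid learner are assumptions chosen for brevity.

```python
# Illustrative sketch of decomposition for classification (assumed toy
# example, not the paper's method): one binary sub-task per class, solved
# independently, then joined by a max-score combination.

def centroid(points):
    """Mean of a list of 2-D points."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def train_binary(X, y, positive):
    """Solve one binary sub-task: score a point by how much closer it is
    to the centroid of the `positive` class than to the rest."""
    pos = [x for x, label in zip(X, y) if label == positive]
    neg = [x for x, label in zip(X, y) if label != positive]
    c_pos, c_neg = centroid(pos), centroid(neg)

    def score(x):
        d_pos = (x[0] - c_pos[0]) ** 2 + (x[1] - c_pos[1]) ** 2
        d_neg = (x[0] - c_neg[0]) ** 2 + (x[1] - c_neg[1]) ** 2
        return d_neg - d_pos  # larger means "more likely `positive`"

    return score

def decompose_and_join(X, y):
    """Decompose the multiclass task into binary sub-tasks, then join
    their solutions: predict the class whose sub-scorer wins."""
    classes = sorted(set(y))
    scorers = {c: train_binary(X, y, c) for c in classes}

    def predict(x):
        return max(classes, key=lambda c: scorers[c](x))

    return predict

# Toy 3-class data: three well-separated clusters (hypothetical).
X = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5), (0, 9), (1, 9), (0, 8)]
y = ["a", "a", "a", "b", "b", "b", "c", "c", "c"]
predict = decompose_and_join(X, y)
```

Each sub-task here is easier than the original (a binary separation rather than a three-way one), which is the motivation the abstract describes; the paper's taxonomy covers this and other decomposition axes (feature set, sample, space).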
Acknowledgment
The author would like to thank Professor Oded Maimon for his helpful and inspiring comments.
Cite this article
Rokach, L. Decomposition methodology for classification tasks: a meta decomposer framework. Pattern Anal Applic 9, 257–271 (2006). https://doi.org/10.1007/s10044-006-0041-y