Abstract
In classification tasks, most learning methods are suitable only for certain data types. For a hybrid dataset consisting of both nominal and numeric attributes, some attributes must be transformed into appropriate types before a learning algorithm can be applied, and this transformation can damage the nature of the dataset. We propose a model tree approach that integrates several characteristically different learning methods to solve the classification problem. We employ the decision tree as the classification framework and incorporate support vector machines into the tree construction process. This design removes the discretization step usually required for tree construction, while decision tree induction itself handles the nominal attributes that methods such as SVMs may not handle well. Experiments show that our proposed method outperforms other competing learning methods.
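The core idea can be illustrated with a toy sketch (hypothetical code, not the authors' implementation): nominal attributes drive the tree split, and a separate model trained on the numeric attributes classifies at each leaf. Here a simple midpoint threshold stands in for the SVM that the paper trains at each node; all names and the data are invented for illustration.

```python
from collections import defaultdict

def fit_leaf(points):
    """Fit a 1-D midpoint threshold on the numeric attribute as a crude
    stand-in for training an SVM at the leaf. points: list of (x, label)."""
    pos = [x for x, y in points if y == 1]
    neg = [x for x, y in points if y == 0]
    if not pos or not neg:
        # Pure leaf: always predict the only class present.
        label = 1 if pos else 0
        return lambda x: label
    # Threshold halfway between the classes (assumes class 1 lies above class 0).
    t = (min(pos) + max(neg)) / 2.0
    return lambda x, t=t: 1 if x > t else 0

def fit_model_tree(rows):
    """rows: list of (nominal_value, numeric_value, label).
    Split once on the nominal attribute, then fit one leaf model per branch,
    so numeric attributes never need to be discretized."""
    branches = defaultdict(list)
    for nom, num, y in rows:
        branches[nom].append((num, y))
    return {nom: fit_leaf(pts) for nom, pts in branches.items()}

def predict(tree, nom, num):
    return tree[nom](num)

# Usage: the decision boundary on the numeric attribute differs per branch.
data = [("red", 1.0, 0), ("red", 2.0, 0), ("red", 5.0, 1), ("red", 6.0, 1),
        ("blue", 1.0, 1), ("blue", 2.0, 1)]
tree = fit_model_tree(data)
```

The nominal split requires no numeric ordering, and the per-leaf model sees only numeric inputs, which is the division of labor the paper exploits between tree induction and SVMs.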
Research partially supported by Taiwan National Science Council Grant # 93-2213-E-011-036
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Pao, HK., Chang, SC., Lee, YJ. (2005). Model Trees for Classification of Hybrid Data Types. In: Gallagher, M., Hogan, J.P., Maire, F. (eds) Intelligent Data Engineering and Automated Learning - IDEAL 2005. IDEAL 2005. Lecture Notes in Computer Science, vol 3578. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11508069_5
DOI: https://doi.org/10.1007/11508069_5
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-26972-4
Online ISBN: 978-3-540-31693-0
eBook Packages: Computer Science (R0)