
Model Trees for Classification of Hybrid Data Types

  • Conference paper
Intelligent Data Engineering and Automated Learning - IDEAL 2005 (IDEAL 2005)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3578)

Abstract

In classification tasks, most learning methods are suited to only certain data types. For a hybrid dataset consisting of both nominal and numeric attributes, some attributes must be transformed into the appropriate type before many learning algorithms can be applied, and this transformation can distort the nature of the dataset. We propose a model tree approach that integrates several characteristically different learning methods to solve the classification problem. We employ the decision tree as the classification framework and incorporate support vector machines into the tree construction process. This design removes the discretization step usually required for tree construction, while decision tree induction itself handles nominal attributes, which methods such as SVMs may not handle well. Experiments show that our proposed method performs better than the competing learning methods.
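The idea sketched in the abstract, a tree that branches on nominal attributes and delegates the numeric attributes in each branch to a separate classifier, can be illustrated as follows. This is a minimal, hypothetical sketch and not the paper's algorithm: the tree is fixed to a single nominal split, and a hand-rolled perceptron stands in for the SVM used at the nodes; all names and data are illustrative.

```python
# Hypothetical sketch: a one-level "model tree" for hybrid data.
# Nominal attributes are handled by the tree split; numeric attributes
# are handled by a linear model (here a perceptron, standing in for
# the SVM used in the paper) fitted within each branch.

from collections import defaultdict

def fit_perceptron(points, labels, epochs=100, lr=0.1):
    """Train a tiny linear classifier on 2-D numeric data with labels in {-1, +1}."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else -1
            if pred != y:  # misclassified: nudge the separator toward the point
                w[0] += lr * y * x1
                w[1] += lr * y * x2
                b += lr * y
    return w, b

class HybridModelTree:
    """Split once on a nominal attribute, then model each branch linearly."""

    def fit(self, nominal, numeric, labels):
        groups = defaultdict(list)
        for cat, x, y in zip(nominal, numeric, labels):
            groups[cat].append((x, y))
        self.models = {}
        for cat, rows in groups.items():
            xs = [x for x, _ in rows]
            ys = [y for _, y in rows]
            self.models[cat] = fit_perceptron(xs, ys)
        return self

    def predict(self, cat, x):
        w, b = self.models[cat]
        return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1

# Toy hybrid data: one nominal attribute ("color") plus two numeric ones.
nominal = ["red", "red", "red", "blue", "blue", "blue"]
numeric = [(0.0, 0.0), (1.0, 0.0), (2.0, 2.0), (0.0, 2.0), (2.0, 0.0), (2.0, 1.0)]
labels  = [-1, -1, 1, 1, -1, -1]

tree = HybridModelTree().fit(nominal, numeric, labels)
print(tree.predict("red", (2.0, 2.0)))  # prints 1: classified within the "red" branch
```

Because the nominal attribute never reaches the linear model, no one-hot encoding or discretization is needed, which is the transformation step the abstract argues can damage the nature of the dataset.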

Research partially supported by Taiwan National Science Council Grant # 93-2213-E-011-036




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg


Cite this paper

Pao, HK., Chang, SC., Lee, YJ. (2005). Model Trees for Classification of Hybrid Data Types. In: Gallagher, M., Hogan, J.P., Maire, F. (eds) Intelligent Data Engineering and Automated Learning - IDEAL 2005. IDEAL 2005. Lecture Notes in Computer Science, vol 3578. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11508069_5


  • DOI: https://doi.org/10.1007/11508069_5

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-26972-4

  • Online ISBN: 978-3-540-31693-0

