
Rule Reduction over Numerical Attributes in Decision Trees Using Multilayer Perceptron

  • Conference paper
  • In: Advances in Knowledge Discovery and Data Mining (PAKDD 2001)
  • Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2035)

Abstract

Many data sets exhibit significant correlations between input variables, and much useful information is hidden in the data in nonlinear form. Neural networks have been shown to model the nonlinear characteristics of sample data better than a direct application of induction trees. We extract a compact set of rules that captures relations among continuous-valued input attributes. These relations, expressed as a set of linear classifiers, are obtained from neural network modeling based on back-propagation. This paper shows that variable thresholds play an important role in constructing linear-classifier rules when a decision tree is built over the linear classifiers extracted from a multilayer perceptron. We test this scheme on several data sets and compare it with decision-tree results.
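The pipeline described in the abstract can be sketched very roughly as follows. This is a minimal illustration, not the authors' implementation: scikit-learn's `MLPClassifier` and `DecisionTreeClassifier` are stand-ins, and the dataset, network size, and tree depth are arbitrary assumptions made for the sake of a runnable example.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# 1. Fit a small multilayer perceptron with back-propagation.
mlp = MLPClassifier(hidden_layer_sizes=(4,), max_iter=2000, random_state=0)
mlp.fit(X, y)

# 2. Each hidden unit defines a linear classifier w.x + b over the
#    continuous attributes; project the data onto those hyperplanes.
W, b = mlp.coefs_[0], mlp.intercepts_[0]   # W: (n_features, n_hidden)
Z = X @ W + b                              # derived linear-classifier attributes

# 3. Build a decision tree over the derived attributes; the thresholds the
#    tree chooses on each z_j play the role of the variable thresholds the
#    paper discusses for the extracted linear-classifier rules.
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(Z, y)
print(tree.score(Z, y))
```

The key design point is that the tree splits on hyperplane outputs rather than on raw attributes, so a single rule can express a relation among several continuous inputs instead of an axis-parallel cut on one.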



Copyright information

© 2001 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kim, D., Lee, J. (2001). Rule Reduction over Numerical Attributes in Decision Trees Using Multilayer Perceptron. In: Cheung, D., Williams, G.J., Li, Q. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2001. Lecture Notes in Computer Science, vol 2035. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45357-1_57


  • DOI: https://doi.org/10.1007/3-540-45357-1_57

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-41910-5

  • Online ISBN: 978-3-540-45357-4

  • eBook Packages: Springer Book Archive
