Improvement of the Neural Network Trees through Fine-Tuning of the Threshold of Each Internal Node

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5863)

Abstract

A neural network tree (NNTree) is a decision tree (DT) in which each internal node contains a neural network (NN). Experimental results show that NNTrees usually perform better than traditional univariate DTs. In addition, NNTrees are easier to use than single fully connected NNs because their structure is determined automatically during induction. Recently, we proposed an algorithm that induces NNTrees efficiently and effectively. In this paper, we propose to further improve the performance of NNTrees by fine-tuning the threshold of each internal node. Experimental results on several public databases show that, although the proposed method is very simple, it improves the performance of the NNTrees in most cases, and in some cases the improvement is significant.
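
The abstract only sketches the idea at a high level. The Python fragment below is a minimal illustration of an NNTree internal node whose routing threshold can be fine-tuned on the training samples that reach it. All names here (InternalNode, fine_tune_threshold, the single-layer scoring network, the accuracy-based threshold search) are assumptions made for this sketch and are not taken from the paper itself, which is not reproduced on this page.

    import numpy as np

    class InternalNode:
        """One internal node of an NNTree: a small neural network whose scalar
        output is compared against a threshold to route a sample left or right.
        (Hypothetical illustration, not the authors' implementation.)"""

        def __init__(self, n_features, rng=np.random.default_rng(0)):
            # A single-layer network with a sigmoid output is used purely for
            # illustration; any small NN producing a scalar score would do.
            self.w = rng.normal(scale=0.1, size=n_features)
            self.b = 0.0
            self.threshold = 0.5  # the quantity to be fine-tuned per node

        def score(self, X):
            # Scalar score in (0, 1) for each row of X.
            return 1.0 / (1.0 + np.exp(-(X @ self.w + self.b)))

        def route(self, X):
            # True -> right child, False -> left child.
            return self.score(X) >= self.threshold

    def fine_tune_threshold(node, X, should_go_left):
        """Pick the threshold that maximises routing accuracy on the training
        samples reaching this node.  `should_go_left[i]` is True when sample i
        ought to be routed to the left child (an assumption of this sketch)."""
        scores = node.score(X)
        best_t, best_acc = node.threshold, -1.0
        for t in np.unique(scores):
            goes_right = scores >= t
            acc = np.mean(goes_right != should_go_left)
            if acc > best_acc:
                best_t, best_acc = t, acc
        node.threshold = best_t
        return best_acc

In an actual NNTree the node networks are trained during induction and the per-node thresholds would be tuned with the whole tree's classification accuracy in mind; the sketch above only conveys the basic idea of adjusting a routing threshold at each internal node after the tree has been induced.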

Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Hayashi, H., Zhao, Q. (2009). Improvement of the Neural Network Trees through Fine-Tuning of the Threshold of Each Internal Node. In: Leung, C.S., Lee, M., Chan, J.H. (eds) Neural Information Processing. ICONIP 2009. Lecture Notes in Computer Science, vol 5863. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-10677-4_75

  • DOI: https://doi.org/10.1007/978-3-642-10677-4_75

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-10676-7

  • Online ISBN: 978-3-642-10677-4

  • eBook Packages: Computer Science (R0)
