Flexible Neural Tree for Pattern Recognition

  • Conference paper
Advances in Neural Networks - ISNN 2006 (ISNN 2006)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3971)

Abstract

This paper presents a novel induction model named Flexible Neural Tree (FNT) for pattern recognition. FNT uses a decision tree for basic analysis and a neural network for subsequent quantitative analysis. The Pure Information Gain I(X_i; ϑ), defined as the test-selection measure FNT uses to construct the decision tree, can handle continuous attributes directly. When the information embodied by a neural-network node reveals new attribute relations, FNT extracts symbolic rules from the neural network to improve the decision process. Experimental studies on a set of natural domains show that FNT has clear advantages in generalization ability.
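The abstract does not give the definition of the Pure Information Gain I(X_i; ϑ), so the sketch below shows only the standard information-gain threshold search that such measures build on when handling a continuous attribute directly: every midpoint between adjacent sorted values is tried as a candidate split, and the one maximizing the entropy reduction is kept. The function names (`entropy`, `best_threshold`) are illustrative, not from the paper.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label multiset."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_threshold(values, labels):
    """Scan midpoints between adjacent sorted values of a continuous
    attribute; return the (threshold, information_gain) pair that best
    splits the labels."""
    pairs = sorted(zip(values, labels))
    base = entropy(labels)
    best_t, best_gain = None, 0.0
    for i in range(1, len(pairs)):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no threshold fits between equal values
        t = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [l for v, l in pairs if v <= t]
        right = [l for v, l in pairs if v > t]
        gain = base - (len(left) / len(pairs)) * entropy(left) \
                    - (len(right) / len(pairs)) * entropy(right)
        if gain > best_gain:
            best_t, best_gain = t, gain
    return best_t, best_gain

# A perfectly separable toy attribute: the best cut lies at 2.5.
t, g = best_threshold([1.0, 2.0, 3.0, 4.0], ['a', 'a', 'b', 'b'])
```

Because candidate thresholds are taken directly from the data, no prior discretization step is needed, which is the practical advantage a continuous-capable test-selection measure provides over discretize-then-split approaches.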

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Li, HJ., Wang, ZX., Wang, LM., Yuan, SM. (2006). Flexible Neural Tree for Pattern Recognition. In: Wang, J., Yi, Z., Zurada, J.M., Lu, BL., Yin, H. (eds) Advances in Neural Networks - ISNN 2006. ISNN 2006. Lecture Notes in Computer Science, vol 3971. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11759966_132

  • DOI: https://doi.org/10.1007/11759966_132

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-34439-1

  • Online ISBN: 978-3-540-34440-7

  • eBook Packages: Computer Science (R0)
