Abstract
This paper presents a novel induction model, the Flexible Neural Tree (FNT), for pattern recognition. FNT uses a decision tree for basic analysis and a neural network for subsequent quantitative analysis. The Pure Information Gain I(X<sub>i</sub>; ϑ), defined as the test-selection measure FNT uses to construct the decision tree, handles continuous attributes directly. When the information embodied in a neural network node reveals new attribute relations, FNT extracts symbolic rules from the neural network to improve the decision process. Experimental studies on a set of natural domains show that FNT has clear advantages in generalization ability.
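The paper's exact definition of the Pure Information Gain I(X<sub>i</sub>; ϑ) is not given in this abstract; as an illustrative stand-in, the sketch below implements the standard information-gain criterion with a threshold search over a continuous attribute, which is the usual way a tree-induction algorithm handles continuous attributes directly (all function names here are hypothetical, not from the paper):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_threshold_gain(values, labels):
    """Search candidate cut points on a continuous attribute and return
    (best_threshold, information_gain) for the resulting binary split.

    Candidate thresholds are midpoints between consecutive distinct
    sorted values, as in classic decision-tree induction.
    """
    base = entropy(labels)
    pairs = sorted(zip(values, labels))
    best_t, best_gain = None, 0.0
    for i in range(1, len(pairs)):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no valid cut between equal attribute values
        t = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [lab for _, lab in pairs[:i]]
        right = [lab for _, lab in pairs[i:]]
        gain = (base
                - (len(left) / len(pairs)) * entropy(left)
                - (len(right) / len(pairs)) * entropy(right))
        if gain > best_gain:
            best_t, best_gain = t, gain
    return best_t, best_gain
```

For example, a perfectly separable attribute such as values `[1.0, 2.0, 3.0, 4.0]` with labels `["a", "a", "b", "b"]` yields the midpoint threshold 2.5 with a gain of 1.0 bit.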
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Li, HJ., Wang, ZX., Wang, LM., Yuan, SM. (2006). Flexible Neural Tree for Pattern Recognition. In: Wang, J., Yi, Z., Zurada, J.M., Lu, BL., Yin, H. (eds) Advances in Neural Networks - ISNN 2006. ISNN 2006. Lecture Notes in Computer Science, vol 3971. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11759966_132
DOI: https://doi.org/10.1007/11759966_132
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-34439-1
Online ISBN: 978-3-540-34440-7
eBook Packages: Computer Science (R0)