Abstract
A hybrid learning approach named confusion-cross-based support vector machine tree (CSVMT) was proposed in our previous work. It achieves good performance on problems with complex distributions even when the two SVM parameters are not appropriately selected. One remaining problem is that, for high-dimensional feature spaces, the trained internal nodes may become highly complex, because irrelevant dimensions add undesirable complexity to the underlying probability distribution of the concept labels that the learning algorithm must capture; learning models with high complexity tend to degrade both test efficiency and generalization performance. In this paper, we propose a feature-selection-based CSVMT (FS-CSVMT) learning approach in which the input space of each internal node is adaptively reduced in dimensionality by sensitivity-based feature selection. Experimental results show that the FS-CSVMT approach performs well.
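The abstract describes reducing each internal node's input space by sensitivity-based feature selection. The paper's exact criterion is not reproduced here, but the general idea can be sketched as: estimate how strongly each input dimension influences the node's output (here by finite-difference perturbation, an assumption for illustration) and keep only the top-ranked features. The functions `sensitivity_scores` and `select_features` below are hypothetical names, not from the paper.

```python
import numpy as np

def sensitivity_scores(predict, X, eps=1e-3):
    """Estimate each feature's influence on the model output by
    finite-difference perturbation, averaged over the sample set."""
    n, d = X.shape
    scores = np.zeros(d)
    base = predict(X)
    for j in range(d):
        Xp = X.copy()
        Xp[:, j] += eps                      # perturb one feature
        scores[j] = np.mean(np.abs(predict(Xp) - base)) / eps
    return scores

def select_features(predict, X, k):
    """Return the indices of the k most sensitive features."""
    scores = sensitivity_scores(predict, X)
    return np.argsort(scores)[::-1][:k]

# Toy example: a linear "node" whose weights make features 0 and 3 dominant.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
w = np.array([5.0, 0.1, 0.0, 2.0, 0.01])
predict = lambda Z: Z @ w
print(select_features(predict, X, 2))  # -> [0 3]
```

For a linear model the sensitivity of feature j reduces to |w_j|, so the ranking recovers the dominant weights; each internal node would then be trained only on the selected subspace.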
This work was supported by NSFC projects (Nos. 60496310, 60102011, BK2001042) and by the Foundation of Excellent Doctoral Dissertation of Southeast University (YBJJ0412).
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Xu, Q., Pei, W., Yang, L., He, Z. (2006). Support Vector Machine Tree Based on Feature Selection. In: King, I., Wang, J., Chan, LW., Wang, D. (eds) Neural Information Processing. ICONIP 2006. Lecture Notes in Computer Science, vol 4232. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11893028_95
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-46479-2
Online ISBN: 978-3-540-46480-8
eBook Packages: Computer Science (R0)