Support Vector Machine Tree Based on Feature Selection

  • Conference paper
Neural Information Processing (ICONIP 2006)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4232)

Abstract

A hybrid learning approach named the confusion-cross-based support vector machine tree (CSVMT) was proposed in our previous work. It achieves good performance on problems with complex distributions even when the two SVM parameters are not appropriately selected. One remaining problem is that, for high-dimensional feature-space problems, the trained internal nodes may become highly complex, because the extra dimensions add undesirable complexity to the underlying probability distribution of the concept label that the learning algorithm must capture; such high-complexity models tend to degrade both test efficiency and performance. In this paper, we propose a feature-selection-based CSVMT (FS-CSVMT) learning approach in which the input space of each internal node is adaptively reduced in dimensionality by sensitivity-based feature selection. Experimental results show that the FS-CSVMT approach performs well.
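The per-node dimensionality reduction described above can be sketched roughly as follows. This is a hypothetical illustration only: it scores each feature by perturbing it and measuring the change in a trained SVM's decision function, then keeps the most sensitive features. The paper's actual sensitivity measure and the function names here are assumptions, and scikit-learn's SVC stands in for whatever SVM implementation the authors used.

```python
# Hypothetical sketch of sensitivity-based feature selection for one
# internal node of an SVM tree (not the paper's exact method).
import numpy as np
from sklearn.svm import SVC

def select_features_by_sensitivity(X, y, keep=2, eps=1e-2):
    """Train an SVM on (X, y), score each feature by the mean absolute
    change in the decision function under a small perturbation, and
    return the indices of the `keep` most sensitive features."""
    svm = SVC(kernel="rbf", gamma="scale").fit(X, y)
    base = svm.decision_function(X)
    scores = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] += eps  # perturb feature j only
        scores[j] = np.mean(np.abs(svm.decision_function(Xp) - base)) / eps
    return np.argsort(scores)[::-1][:keep]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    # Only features 0 and 1 determine the label; the other three are noise.
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    print(sorted(select_features_by_sensitivity(X, y, keep=2)))
```

In an SVM tree this selection would be applied independently at each internal node before training that node's SVM, so different nodes may retain different feature subsets.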

This work was supported by the Project of NSFC (No. 60496310, 60102011, BK2001042) and a Foundation of Excellent Doctoral Dissertation of Southeast University (YBJJ0412).


Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Xu, Q., Pei, W., Yang, L., He, Z. (2006). Support Vector Machine Tree Based on Feature Selection. In: King, I., Wang, J., Chan, LW., Wang, D. (eds) Neural Information Processing. ICONIP 2006. Lecture Notes in Computer Science, vol 4232. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11893028_95

  • DOI: https://doi.org/10.1007/11893028_95

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-46479-2

  • Online ISBN: 978-3-540-46480-8

  • eBook Packages: Computer Science (R0)
