SVMT-Rule: Association Rule Mining Over SVM Classification Trees

Chapter in: Rule Extraction from Support Vector Machines

Part of the book series: Studies in Computational Intelligence (SCI, volume 80)

Since support vector machines (SVMs) [7–9] achieve good accuracy in classification and regression, extracting rules from a trained SVM (SVM-Rule) is an important procedure for data mining and knowledge discovery [1–6, 29, 31]. In practice, however, the rules obtained by SVM-Rule are less comprehensible than expected, because a large number of opaque numerical parameters (i.e., support vectors) appear in those rules. Compared with SVM-Rule, the decision tree is a simple but very efficient rule-extraction method in terms of comprehensibility [33]. The rules obtained from a decision tree may not be as accurate as SVM rules, but they are easy to comprehend, because every rule represents one decision path that is traceable in the tree.
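The contrast can be illustrated with a small sketch (this is not the chapter's SVMT-Rule algorithm): an SVM's decision function is expressed through its support vectors and kernel weights, which resist a symbolic reading, whereas every root-to-leaf path of a decision tree prints directly as an IF-THEN rule. The use of scikit-learn and the Iris dataset below is an illustrative assumption only.

```python
# Minimal sketch: opaque SVM parameters vs. traceable decision-tree rules.
# Assumes scikit-learn; dataset and hyperparameters are arbitrary choices.
from sklearn.datasets import load_iris
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier, export_text

data = load_iris()
X, y = data.data, data.target

# The trained SVM stores its knowledge as support vectors plus kernel weights,
# i.e., a set of numerical parameters rather than readable conditions.
svm = SVC(kernel="rbf").fit(X, y)
print("Number of support vectors per class:", svm.n_support_)

# The decision tree yields one traceable IF-THEN rule per root-to-leaf path.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(data.feature_names)))
```

Each printed branch of the tree reads directly as a rule (a condition on a feature followed by a class label), while the SVM exposes only counts and coordinates of its support vectors.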


References

  1. H. Nunez, C. Angulo, and A. Catala (2002) Rule extraction from support vector machines. The European Symposium on Artificial Neural Networks, Bruges, pp. 107–112.

  2. Y. Zhang, H. Y. Su, T. Jia, and J. Chu (2005) Rule extraction from trained support vector machines. PAKDD 2005, LNAI 3518, pp. 61–70, Springer-Verlag, Berlin Heidelberg.

  3. Lipo Wang and Xiuju Fu (2005) Rule extraction from support vector machines. In: Data Mining with Computational Intelligence, Advanced Information and Knowledge Processing, Springer, Berlin Heidelberg.

  4. N. Barakat and A. P. Bradley (2006) Rule extraction from support vector machines: Measuring the explanation capability using the area under the ROC curve. The 18th International Conference on Pattern Recognition (ICPR'06), August 2006, Hong Kong.

  5. Glenn Fung, Sathyakama Sandilya, and Bharat Rao (2005) Rule extraction for linear support vector machines. KDD 2005, August 21–24, 2005, Chicago, Illinois, USA.

  6. Xiuju Fu, C. Ong, S. Keerthi, G. G. Huang, and L. Goh (2004) Proceedings of the IEEE International Joint Conference on Neural Networks, vol. 1, 25–29 July 2004, pp. 291–296.

  7. V. Vapnik (1982) Estimation of Dependences Based on Empirical Data. Springer-Verlag.

  8. V. Vapnik (1995) The Nature of Statistical Learning Theory. Springer-Verlag, New York.

  9. C. Cortes and V. Vapnik (1995) "Support-vector networks," Machine Learning, vol. 20, pp. 273–297.

  10. Shaoning Pang, D. Kim, and S. Y. Bang (2003) "Membership authentication in the dynamic group by face classification using SVM ensemble," Pattern Recognition Letters, vol. 24, pp. 215–225.

  11. Shaoning Pang (2005) SVM Aggregation: SVM, SVM Ensemble, SVM Classification Tree. IEEE SMC eNewsletter, Dec. 2005. http://www.ieeesmc.org/Newsletter/Dec2005/R11Pang.php

  12. Shaoning Pang, D. Kim, and S. Y. Bang (2005) "Face membership authentication using SVM classification tree generated by membership-based LLE data partition," IEEE Trans. on Neural Networks, vol. 16, no. 2, pp. 436–446, Mar. 2005.

  13. B. Schölkopf, J. C. Platt, J. Shawe-Taylor, A. J. Smola, and R. C. Williamson (1999) "Estimating the support of a high-dimensional distribution," Technical Report MSR-TR-99-87, Microsoft Research.

  14. D. M. J. Tax (2001) One-Class Classification: Concept-learning in the Absence of Counter-examples. Ph.D. Thesis.

  15. D. M. J. Tax and R. P. W. Duin (2001) Combining one-class classifiers. LNCS 2096, pp. 299–308.

  16. Y. Xu and R. G. Brereton (2005) Diagnostic pattern recognition on gene expression profile data by using one-class classifiers. J. Chem. Inf. Model., 45:1392–1401.

  17. J. R. Quinlan (1993) C4.5: Programs for Machine Learning. Morgan Kaufmann, San Mateo, CA.

  18. Hyun-Chul Kim, Shaoning Pang, Hong-Mo Je, Daijin Kim, and Sung Yang Bang, "Constructing support vector machine ensemble," Pattern Recognition, vol. 36, no. 12, pp. 2757–2767.

  19. M. A. Shipp, K. N. Ross, et al. (2002) "Diffuse large B-cell lymphoma outcome prediction by gene-expression profiling and supervised machine learning," Nature Medicine, 8(1):68–74.

  20. T. R. Golub (2004) "Toward a functional taxonomy of cancer," Cancer Cell, 6(2):107–108.

  21. S. Pomeroy, P. Tamayo, et al. (2002) "Prediction of central nervous system embryonal tumour outcome based on gene expression," Nature, 415(6870):436–442.

  22. U. Alon, N. Barkai, et al. (1999) Broad patterns of gene expression revealed by clustering analysis of tumor and normal colon tissues probed by oligonucleotide arrays. Proc. Natl. Acad. Sci. USA.

  23. E. F. Petricoin, A. M. Ardekani, et al. (2002) "Use of proteomic patterns in serum to identify ovarian cancer," Lancet, 359:572–577.

  24. L. J. van't Veer et al. (2002) "Gene expression profiling predicts clinical outcome of breast cancer," Nature, 415:530–536.

  25. G. J. Gordon, R. Jensen, et al. (2002) "Translation of microarray data into clinically relevant cancer diagnostic tests using gene expression ratios in lung cancer and mesothelioma," Cancer Research, 62:4963–4967.

  26. S. Dudoit, Y. H. Yang, M. J. Callow, and T. P. Speed (2002) Statistical methods for identifying differentially expressed genes in replicated cDNA microarray experiments. Statistica Sinica, 12:111–139.

  27. Tin Kam Ho (1998) "The random subspace method for constructing decision forests," IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(8):832–844.

  28. NeuCom - A Neuro-computing Decision Support Environment, Knowledge Engineering and Discovery Research Institute, Auckland University of Technology. www.theneucom.com

  29. H. Nunez, C. Angulo, and A. Catala (2003) "Hybrid architecture based on support vector machines," In: Computational Methods in Neural Modeling, Lecture Notes in Computer Science, vol. 2686, pp. 646–653.

  30. Z. H. Zhou and Y. Jiang (2003) "Medical diagnosis with C4.5 rule preceded by artificial neural network ensemble," IEEE Trans. on Information Technology in Biomedicine, 7(1):37–42.

  31. Yixin Chen and J. Z. Wang (2003) "Support vector learning for fuzzy rule-based classification systems," IEEE Transactions on Fuzzy Systems, 11(6):716–728.

  32. H. Nunez, C. Angulo, and A. Catala (2002) "Support vector machines with symbolic interpretation," Proceedings of the VII Brazilian Symposium on Neural Networks, pp. 142–147, 11–14 Nov. 2002.

  33. W. Duch, R. Setiono, and J. M. Zurada (2004) "Computational intelligence methods for rule-based data understanding," Proceedings of the IEEE, 92(5):771–805, May 2004.

  34. Shaoning Pang, Daijin Kim, and S. Y. Bang (2001) Fraud detection using support vector machine ensemble. ICONIP 2001, Shanghai, China.

  35. Y. S. Chen and T. H. Chu (1995) "A neural network classification tree," Proc. IEEE Int. Conf. Neural Networks, vol. 1, Nov. 27–Dec. 1, 1995, pp. 409–413.


Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Pang, S., Kasabov, N. (2008). SVMT-Rule: Association Rule Mining Over SVM Classification Trees. In: Diederich, J. (ed.) Rule Extraction from Support Vector Machines. Studies in Computational Intelligence, vol 80. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-75390-2_6

  • DOI: https://doi.org/10.1007/978-3-540-75390-2_6

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-75389-6

  • Online ISBN: 978-3-540-75390-2
