Since support vector machines (SVMs) [7–9] achieve good accuracy in classification and regression, extracting rules from a trained SVM (SVM-Rule) is an important procedure for data mining and knowledge discovery [1–6, 29, 31]. In practice, however, the rules obtained by SVM-Rule are less comprehensible than expected, because a large number of opaque numerical parameters (i.e., support vectors) appear in them. Compared with SVM-Rule, the decision tree is a simple but very efficient rule-extraction method in terms of comprehensibility [33]. The rules obtained from a decision tree may not be as accurate as SVM rules, but they are easy to comprehend, because every rule represents one decision path that is traceable in the tree.
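To illustrate why decision-tree rules are traceable, the sketch below (a minimal, hypothetical Python example with an invented toy tree; it is not the chapter's SVMT-Rule algorithm) enumerates every root-to-leaf path of a small hand-built tree. Each path reads directly as an IF-THEN rule, whereas an SVM decision function is a weighted sum over support vectors with no comparable path to follow.

```python
# Minimal sketch: each root-to-leaf path of a decision tree is one IF-THEN rule.
# The tree and feature names below are hypothetical, for illustration only.

def extract_rules(node, conditions=()):
    """Recursively collect (conditions, class_label) pairs, one per leaf."""
    if "label" in node:  # leaf: emit the accumulated path as a rule
        return [(list(conditions), node["label"])]
    feat, thr = node["feature"], node["threshold"]
    rules = []
    rules += extract_rules(node["left"], conditions + (f"{feat} <= {thr}",))
    rules += extract_rules(node["right"], conditions + (f"{feat} > {thr}",))
    return rules

# A toy two-split tree over two hypothetical gene-expression features.
tree = {
    "feature": "gene1", "threshold": 0.5,
    "left": {"label": "normal"},
    "right": {
        "feature": "gene2", "threshold": 1.2,
        "left": {"label": "normal"},
        "right": {"label": "tumor"},
    },
}

for conds, label in extract_rules(tree):
    print("IF " + " AND ".join(conds) + f" THEN {label}")
```

Running the loop prints one rule per leaf, e.g. `IF gene1 > 0.5 AND gene2 > 1.2 THEN tumor`; comprehensibility follows from the one-to-one mapping between rules and decision paths.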
References
H. Nunez, C. Angulo, and A. Catala (2002) Rule Extraction from Support Vector Machines. The European Symposium on Artificial Neural Networks (ESANN 2002), Bruges, pp. 107–112.
Y. Zhang, H. Y. Su, T. Jia, and J. Chu (2005) Rule Extraction from Trained Support Vector Machines. PAKDD 2005, LNAI 3518, pp. 61–70, Springer-Verlag Berlin Heidelberg.
Lipo Wang and Xiuju Fu (2005) Rule Extraction from Support Vector Machine. In: Data Mining with Computational Intelligence, Advanced Information and Knowledge Processing, Springer Berlin Heidelberg.
N. Barakat and Andrew P. Bradley (2006) Rule Extraction from Support Vector Machines: Measuring the Explanation Capability Using the Area under the ROC Curve. In The 18th International Conference on Pattern Recognition (ICPR’06), August, 2006, Hong Kong.
Glenn Fung, Sathyakama Sandilya, and Bharat Rao (2005) Rule Extraction for Linear Support Vector Machines. KDD 2005, August 21–24, 2005, Chicago, Illinois, USA.
Xiuju Fu, C. Ong, S. Keerthi, G. G. Huang, and L. Goh (2004) Proceedings of IEEE International Joint Conference on Neural Networks, vol. 1, 25–29 July 2004, pp. 291–296.
V. Vapnik, Estimation of dependences based on empirical data. Springer-Verlag, 1982.
V. Vapnik, The nature of statistical learning theory, New York: Springer-Verlag, 1995.
C. Cortes and V. Vapnik, “Support-vector networks,” Machine Learning, vol. 20, pp. 273–297, 1995.
Shaoning Pang, D. Kim, and S. Y. Bang, “Membership authentication in the dynamic group by face classification using SVM ensemble,” Pattern Recognition Letters, vol. 24, pp. 215–225, 2003.
Shaoning Pang, SVM Aggregation: SVM, SVM Ensemble, SVM Classification Tree, IEEE SMC eNewsletter Dec. 2005. http://www.ieeesmc.org/Newsletter/Dec2005/R11Pang.php
Shaoning Pang, D. Kim, and S. Y. Bang, “Face Membership Authentication Using SVM Classification Tree Generated by Membership-based LLE Data Partition,” IEEE Trans. on Neural Networks, vol. 16, no. 2, pp. 436–446, Mar. 2005.
B. Schölkopf, J.C. Platt, J. Shawe-Taylor, A.J. Smola, and R.C. Williamson, “Estimating the support of a high-dimensional distribution,” Technical report, Microsoft Research, MSR-TR-99-87, 1999.
Tax D.M.J. (2001) One-class Classification, concept-learning in the absence of counter-examples. Ph.D. Thesis.
Tax D.M.J. and Duin R.P.W. (2001) Combining one-class classifiers. LNCS 2096: 299–308.
Xu Y. and Brereton R. G. (2005) Diagnostic pattern recognition on gene expression profile data by using one-class classifiers. J. Chem. Inf. Model. 45:1392–1401.
J. R. Quinlan, C4.5: Programs for Machine Learning. San Mateo, CA: Morgan Kaufmann, 1993.
Hyun-Chul Kim, Shaoning Pang, Hong-Mo Je, Daijin Kim, and Sung Yang Bang, “Constructing support vector machine ensemble,” Pattern Recognition, vol. 36, no. 12, pp. 2757–2767, 2003.
M. A. Shipp, K. N. Ross, et al. (2002) “Supplementary Information for Diffuse large B-cell lymphoma outcome prediction by gene-expression profiling and supervised machine learning,” Nature Medicine, 8(1):68–74.
T. R. Golub, “Toward a functional taxonomy of cancer,” Cancer Cell, 6(2):107–108, (2004).
S. Pomeroy, P. Tamayo, et al., “Prediction of Central Nervous System Embryonal Tumour Outcome Based on Gene Expression,” Nature, 415(6870), 436–442, 2002.
U. Alon, N. Barkai, et al., Broad Patterns of Gene Expression Revealed by Clustering Analysis of Tumor and Normal Colon Tissues Probed by Oligonucleotide Arrays, Proc Natl Acad Sci, USA, (1999).
E. F. Petricoin, A. M. Ardekani, et al., “Use of Proteomic Patterns in Serum to Identify Ovarian Cancer,” Lancet, 359, 572–577, (2002).
Van’t Veer L. J. et al., “Gene expression profiling predicts clinical outcome of breast cancer,” Nature 415:530–536, (2002).
G. J. Gordon, R. Jensen, et al., “Translation of Microarray Data into Clinically Relevant Cancer Diagnostic Tests Using Gene Expression Ratios in Lung Cancer and Mesothelioma,” Cancer Research, 62, 4963–4967, (2002).
Dudoit, S., Yang, Y. H., Callow, M. J., and Speed, T. P. (2002) Statistical methods for identifying differentially expressed genes in replicated cDNA microarray experiments. Stat. Sinica, 12, 111–139.
Tin Kam Ho, “The random subspace method for constructing decision forests,” IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(8), pp. 832–844, Aug. 1998.
NeuCom - A Neuro-computing Decision Support Environment, Knowledge Engineering and Discovery Research Institute, Auckland University of Technology, www.theneucom.com.
H. Nunez, C. Angulo, and A. Catala, “Hybrid Architecture Based on Support Vector Machines,” Lecture Notes in Computer Science, vol. 2686, in Computational Methods in Neural Modeling, pp. 646–653, 2003.
Z. H. Zhou and Y. Jiang, “Medical Diagnosis with C4.5 Rule Preceded by Artificial Neural Network Ensemble,” IEEE Trans. on Information Technology in Biomedicine, 7(1):37–42, 2003.
Yixin Chen and J. Z. Wang, “Support vector learning for fuzzy rule-based classification systems,” IEEE Transactions on Fuzzy Systems, 11(6), pp. 716–728, 2003.
H. Nunez, C. Angulo, and A. Catala, “Support vector machines with symbolic interpretation,” Proceedings. VII Brazilian Symposium on Neural Networks, pp. 142–147, 11–14 Nov. 2002.
W. Duch, R. Setiono, and J. M. Zurada, “Computational intelligence methods for rule-based data understanding,” Proc. of the IEEE, 92(5), pp. 771–805, May 2004.
Shaoning Pang, Daijin Kim, S. Y. Bang (2001) Fraud Detection Using Support Vector Machine Ensemble. ICONIP 2001, Shanghai, China.
Y. S. Chen and T. H. Chu, “A neural network classification tree,” Proc. IEEE Int. Conf. Neural Networks, Vol. 1 , Nov. 27-Dec. 1, 1995, pp. 409–413.
© 2008 Springer-Verlag Berlin Heidelberg
Cite this chapter
Pang, S., Kasabov, N. (2008). SVMT-Rule: Association Rule Mining Over SVM Classification Trees. In: Diederich, J. (eds) Rule Extraction from Support Vector Machines. Studies in Computational Intelligence, vol 80. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-75390-2_6
DOI: https://doi.org/10.1007/978-3-540-75390-2_6
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-75389-6
Online ISBN: 978-3-540-75390-2