Abstract
Radial basis function (RBF) models have been successfully employed to study a broad range of data mining problems and benchmark data sets for real-world scientific and engineering applications. In this paper we investigate RBF models with Gaussian kernels by developing classifiers in a systematic way. In particular, we employ our newly developed RBF design algorithm in a detailed performance study and sensitivity analysis of classification models for the popular Monk's problems. The results show that our classifiers achieve high accuracy, while the classification approach remains systematic and easy to implement. In addition, the differing complexity of the three Monk's problems is clearly reflected in the classification error surfaces for the test data. By exploring these surfaces, we gain a better understanding of the underlying classification problems. Finally, we study the error surfaces to investigate trade-offs among different choices of model parameters and to develop efficient, parsimonious models for a given application.
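To make the model class concrete: a Gaussian-kernel RBF classifier maps an input x through basis functions exp(-||x - c_j||^2 / (2*sigma^2)) centered at points c_j, then combines them linearly. The following is a minimal sketch of such a classifier, not the paper's actual design algorithm; the choice of centers (here, the training points themselves), the ridge term, and all function names are illustrative assumptions.

```python
import numpy as np

def gaussian_design_matrix(X, centers, width):
    """Phi[i, j] = exp(-||X[i] - centers[j]||^2 / (2 * width^2))."""
    sq_dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq_dists / (2.0 * width ** 2))

def fit_rbf_classifier(X, y, centers, width, ridge=1e-8):
    """Solve a ridge-regularized least-squares problem for the output weights."""
    Phi = gaussian_design_matrix(X, centers, width)
    A = Phi.T @ Phi + ridge * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

def predict_rbf(X, centers, width, weights):
    """Threshold the real-valued network output at 0.5 for binary labels."""
    return (gaussian_design_matrix(X, centers, width) @ weights >= 0.5).astype(int)

# Toy usage: the XOR pattern, which no linear classifier can separate.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
w = fit_rbf_classifier(X, y, centers=X, width=0.5)
print(predict_rbf(X, X, 0.5, w))  # recovers the XOR labels
```

The kernel width and the number of centers are exactly the kinds of model parameters whose trade-offs the error surfaces in the paper are used to explore: a too-small width memorizes the training data, a too-large one blurs class boundaries.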
© 2006 Springer-Verlag Berlin Heidelberg
Shin, M. (2006). A Performance Study of Gaussian Kernel Classifiers for Data Mining Applications. In: Li, X., Zaïane, O.R., Li, Z. (eds) Advanced Data Mining and Applications. ADMA 2006. Lecture Notes in Computer Science, vol 4093. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11811305_21
DOI: https://doi.org/10.1007/11811305_21
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-37025-3
Online ISBN: 978-3-540-37026-0