Abstract
We present a method for reducing the complexity of kernel nonlinear classifiers through a sparsification process applied to the training data set. This is achieved with the neural growing gas clustering method operating in feature space: the kernel "trick" is used to extract a relevant subset of the data in feature space according to a geometrical criterion. Classical algorithms, SMO and Adatron learning, are then applied to the selected data to obtain the kernel machine solution. The approach covers a wide range of algorithms and improves current sparse kernel-based classification methods in two respects. First, it reduces the complexity of the training phase by extracting a small, representative subset of training instances from the full training set. Second, the resulting kernel machine is extremely sparse without any loss of generalization capacity. The method is compared experimentally, on synthetic and real data benchmarks, with other proposed sparse kernel-based classification methods.
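The geometric step the abstract describes, clustering in feature space without ever computing the feature map explicitly, rests on one identity: the squared distance between a mapped point φ(x) and a cluster prototype c = Σᵢ αᵢ φ(xᵢ) expands entirely into kernel evaluations. The sketch below illustrates that identity with an RBF kernel; the function names and the toy data are illustrative assumptions, not the authors' code.

```python
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    # Gaussian RBF kernel k(a, b) = exp(-gamma * ||a - b||^2)
    d = a - b
    return np.exp(-gamma * np.dot(d, d))

def feature_space_dist2(x, X_ref, alpha, kernel=rbf_kernel):
    """Squared feature-space distance between phi(x) and a prototype
    c = sum_i alpha_i * phi(X_ref[i]), via the kernel trick:

      ||phi(x) - c||^2 = k(x, x)
                         - 2 * sum_i alpha_i * k(x, x_i)
                         + sum_{i,j} alpha_i * alpha_j * k(x_i, x_j)
    """
    kxx = kernel(x, x)
    kx = np.array([kernel(x, xi) for xi in X_ref])
    K = np.array([[kernel(xi, xj) for xj in X_ref] for xi in X_ref])
    return kxx - 2.0 * (alpha @ kx) + alpha @ K @ alpha

# Toy usage: a prototype supported on two reference points
X_ref = np.array([[0.0, 0.0], [1.0, 0.0]])
alpha = np.array([0.5, 0.5])  # convex combination weights
d2 = feature_space_dist2(np.array([0.5, 0.0]), X_ref, alpha)
print(d2)
```

A growing-gas-style procedure in feature space would use such distances to find the nearest prototypes to each mapped sample; the points anchoring the surviving prototypes form the reduced training set handed to SMO or the Adatron.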
© 2004 Springer-Verlag Berlin Heidelberg
D'Amato, L., Moreno, J.A., Mujica, R. (2004). Reducing the Complexity of Kernel Machines with Neural Growing Gas in Feature Space. In: Lemaître, C., Reyes, C.A., González, J.A. (eds) Advances in Artificial Intelligence – IBERAMIA 2004. Lecture Notes in Computer Science, vol 3315. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30498-2_80
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-23806-5
Online ISBN: 978-3-540-30498-2