Abstract
Support vector machines (SVMs) have been introduced for pattern recognition and regression, but their practical application is limited by training time and by the choice of kernel function. Motivated by the theory of multi-scale signal representations and wavelet transforms, this paper presents a way to build a wavelet-based reproducing kernel Hilbert space (RKHS), which is a multiresolution scale subspace, together with its associated scaling kernel for least squares support vector machines (LS-SVMs). The scaling kernel is constructed from a scaling function at different dilations and translations. Results on several approximation problems illustrate that the LS-SVM with a scaling kernel can approximate signals at multiple scales and achieves better approximation performance.
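The construction described in the abstract can be sketched numerically. The snippet below is a minimal illustration, not the authors' implementation: it assumes a one-dimensional scaling kernel of the form K(x, y) = Σ_k φ(x/a − k) φ(y/a − k), uses a Gaussian as a stand-in for the scaling function φ, and fits the standard LS-SVM regression KKT linear system. The kernel shape, shift range, and regularization value are all illustrative choices.

```python
import numpy as np

def scaling_kernel(X, Y, scale=0.5, shifts=range(-12, 13)):
    # Hypothetical scaling kernel K(x, y) = sum_k phi(x/a - k) * phi(y/a - k),
    # with a Gaussian standing in for the scaling function phi.
    phi = lambda t: np.exp(-0.5 * t**2)
    K = np.zeros((len(X), len(Y)))
    for k in shifts:
        # One rank-one term per translation k of the scaling function.
        K += np.outer(phi(X / scale - k), phi(Y / scale - k))
    return K

def lssvm_fit(x, y, gamma=100.0, **kw):
    # LS-SVM regression solves the linear KKT system
    #   [ 0   1^T         ] [  b  ]   [ 0 ]
    #   [ 1   K + I/gamma ] [alpha] = [ y ]
    n = len(x)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = scaling_kernel(x, x, **kw) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]          # alpha, bias b

def lssvm_predict(x_train, alpha, b, x_new, **kw):
    return scaling_kernel(x_new, x_train, **kw) @ alpha + b

# Approximate a noisy sinc signal, a common benchmark for such kernels.
rng = np.random.default_rng(0)
x = np.linspace(-5.0, 5.0, 60)
y = np.sinc(x) + 0.05 * rng.standard_normal(60)
alpha, b = lssvm_fit(x, y)
y_hat = lssvm_predict(x, alpha, b, x)
print("training RMSE:", np.sqrt(np.mean((y_hat - y) ** 2)))
```

Because each translation contributes a rank-one outer product of the same feature map, the resulting Gram matrix is positive semidefinite by construction, so the kernel is admissible without a separate Mercer check.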
Copyright information
© 2007 Springer-Verlag Berlin Heidelberg
Cite this paper
Xiangyang, M., Taiyi, Z., Yatong, Z. (2007). Scaling Kernels: A New Least Squares Support Vector Machine Kernel for Approximation. In: Gelbukh, A., Kuri Morales, Á.F. (eds.) MICAI 2007: Advances in Artificial Intelligence. Lecture Notes in Computer Science, vol. 4827. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-76631-5_37
DOI: https://doi.org/10.1007/978-3-540-76631-5_37
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-76630-8
Online ISBN: 978-3-540-76631-5