Scaling Kernels: A New Least Squares Support Vector Machine Kernel for Approximation

  • Conference paper
MICAI 2007: Advances in Artificial Intelligence (MICAI 2007)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 4827)


Abstract

Support vector machines (SVMs) have been introduced for pattern recognition and regression, but their practical application is limited by training time and by the choice of kernel function. Motivated by the theory of multi-scale signal representations and wavelet transforms, this paper presents a way to build a wavelet-based reproducing kernel Hilbert space (RKHS), which is a multiresolution scale subspace, together with its associated scaling kernel for least squares support vector machines (LS-SVM). The scaling kernel is constructed from a scaling function and its dilations and translations. Results on several approximation problems illustrate that the LS-SVM with the scaling kernel can approximate an arbitrary signal at multiple scales and offers better approximation performance.
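The abstract combines two standard ingredients: a kernel assembled from dilations and translations of a scaling function, and the usual LS-SVM regression machinery. The sketch below is not taken from the paper; it is a minimal illustration of how such a scaling kernel can be built and plugged into the standard LS-SVM linear system of Suykens and Vandewalle. The Shannon scaling function, the dilation level j, the truncated translation range, and the regularization parameter gamma are all assumptions chosen for illustration.

```python
import numpy as np

# Minimal sketch only -- NOT the construction from the paper. The Shannon
# scaling function, the dilation level j, the truncated translation range,
# and gamma are all illustrative assumptions.

def phi(x):
    """Shannon scaling function; np.sinc is the normalized sinc sin(pi x)/(pi x)."""
    return np.sinc(x)

def scaling_kernel(x, y, j=3, translations=range(-24, 25)):
    """Scale-j kernel K_j(x, y) = 2^j * sum_k phi(2^j x - k) phi(2^j y - k),
    truncated to finitely many translations k."""
    s = 2.0 ** j
    return s * sum(phi(s * x - k) * phi(s * y - k) for k in translations)

def lssvm_fit(X, y, kernel, gamma=100.0):
    """Standard LS-SVM regression: solve the linear system
    [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(X)
    K = np.array([[kernel(xi, xj) for xj in X] for xi in X])
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.asarray(y, dtype=float)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, coefficients alpha

def lssvm_predict(x, X, b, alpha, kernel):
    return b + sum(a * kernel(x, xi) for a, xi in zip(alpha, X))

# Toy usage: a signal with a slow and a fast component.
X = np.linspace(-2.0, 2.0, 60)
y = np.sin(2 * np.pi * X) + 0.3 * np.sin(6 * np.pi * X)
b, alpha = lssvm_fit(X, y, scaling_kernel)
y_hat = np.array([lssvm_predict(x, X, b, alpha, scaling_kernel) for x in X])
print("training RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```

A single dilation level is fixed here for brevity; a kernel mixing several dilation levels, closer in spirit to a multi-scale construction, would simply sum the per-scale kernels K_j over the chosen levels.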

Editor information

Alexander Gelbukh, Ángel Fernando Kuri Morales

Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Xiangyang, M., Taiyi, Z., Yatong, Z. (2007). Scaling Kernels: A New Least Squares Support Vector Machine Kernel for Approximation. In: Gelbukh, A., Kuri Morales, Á.F. (eds) MICAI 2007: Advances in Artificial Intelligence. MICAI 2007. Lecture Notes in Computer Science (LNAI), vol 4827. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-76631-5_37

  • DOI: https://doi.org/10.1007/978-3-540-76631-5_37

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-76630-8

  • Online ISBN: 978-3-540-76631-5

  • eBook Packages: Computer Science (R0)
