Least Squares Support Vector Machine Based on Continuous Wavelet Kernel

  • Conference paper
Advances in Neural Networks – ISNN 2005 (ISNN 2005)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 3496)


Abstract

Based on continuous wavelet transform theory and the conditions for an admissible support vector kernel, a novel multidimensional wavelet kernel is proposed for the Least Squares Support Vector Machine, yielding a Least Squares Wavelet Support Vector Machine (LS-WSVM) for pattern recognition and function estimation. A theoretical analysis of the wavelet kernel is presented in detail. The good approximation property of the wavelet kernel function enhances the generalization ability of the LS-WSVM method, and experimental results are presented to illustrate the effectiveness and feasibility of the proposed approach.

This work was supported by the National 973 Key Fundamental Research Project of China under grant 2002CB312200 and the National 863 High Technology Projects Foundation of China under grant 2002AA412010.
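
To make the kernel construction concrete, the following is a minimal sketch of LS-SVM function estimation with a translation-invariant, product-form wavelet kernel. The Morlet-type mother wavelet h(u) = cos(1.75u) exp(-u^2/2), the shared dilation parameter a, and the regularization constant gamma are illustrative assumptions drawn from the wavelet-kernel SVM literature, not the paper's exact formulation.

import numpy as np

def wavelet_kernel(X1, X2, a=1.0):
    # Product wavelet kernel K(x, x') = prod_i h((x_i - x'_i) / a),
    # with the (assumed) Morlet-type mother wavelet h(u) = cos(1.75 u) exp(-u^2 / 2).
    diff = (X1[:, None, :] - X2[None, :, :]) / a        # shape (n1, n2, d)
    h = np.cos(1.75 * diff) * np.exp(-0.5 * diff ** 2)  # mother wavelet per dimension
    return np.prod(h, axis=2)                           # multiply over the d dimensions

def lssvm_fit(X, y, a=1.0, gamma=100.0):
    # Solve the standard LS-SVM linear system for regression:
    # [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = X.shape[0]
    K = wavelet_kernel(X, X, a)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]                              # alpha, b

def lssvm_predict(X_train, alpha, b, X_new, a=1.0):
    # f(x) = sum_i alpha_i K(x, x_i) + b
    return wavelet_kernel(X_new, X_train, a) @ alpha + b

# Toy usage: approximate sin(x)/x on [-10, 10] from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-10.0, 10.0, size=(200, 1))
y = np.sinc(X[:, 0] / np.pi) + 0.05 * rng.standard_normal(200)  # sinc(x/pi) = sin(x)/x
alpha, b = lssvm_fit(X, y, a=2.0, gamma=100.0)
X_new = np.linspace(-10.0, 10.0, 400)[:, None]
y_hat = lssvm_predict(X, alpha, b, X_new, a=2.0)

The product form over the input dimensions is what makes the kernel multidimensional while keeping it translation invariant; in practice the dilation a and regularization gamma would be chosen by cross-validation.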




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Wen, X., Cai, Y., Xu, X. (2005). Least Squares Support Vector Machine Based on Continuous Wavelet Kernel. In: Wang, J., Liao, X., Yi, Z. (eds) Advances in Neural Networks – ISNN 2005. ISNN 2005. Lecture Notes in Computer Science, vol 3496. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11427391_135

  • DOI: https://doi.org/10.1007/11427391_135

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-25912-1

  • Online ISBN: 978-3-540-32065-4

  • eBook Packages: Computer Science, Computer Science (R0)
