A Method to Construct the Mapping to the Feature Space for the Dot Product Kernels

  • Conference paper
Advances in Machine Learning and Cybernetics

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3930)

Abstract

Dot product kernels are an important class of kernels in the theory of support vector machines. This paper develops a method to construct the mapping that takes the original data set into the high-dimensional feature space on which the inner product is defined by a dot product kernel; the method also applies to Gaussian kernels. Via this mapping, the structure of features in the feature space is easy to observe, and the linear separability of data sets in the feature space is studied. We show that any two finite sets of data with empty overlap in the original space become linearly separable in an infinite-dimensional feature space, and we derive a necessary and sufficient condition for two infinite sets of data in the original space to be linearly separable in the feature space. This condition can be used to examine the existence and uniqueness of a hyperplane that separates all possible inputs correctly.
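
As an illustration of how such an explicit mapping can look (a standard construction stated here only as a sketch; the paper's own construction may differ in detail), consider a dot product kernel k(x, y) = f(⟨x, y⟩) on ℝ^d whose defining function has a power series f(t) = Σ_{n ≥ 0} a_n t^n with a_n ≥ 0. Indexing feature coordinates by multi-indices α ∈ ℕ_0^d, one may take

\[
\varphi_\alpha(x) \;=\; \sqrt{a_{|\alpha|}\,\frac{|\alpha|!}{\alpha!}}\; x^\alpha,
\qquad
\langle \varphi(x), \varphi(y) \rangle
\;=\; \sum_{n \ge 0} a_n \sum_{|\alpha| = n} \frac{n!}{\alpha!}\, x^\alpha y^\alpha
\;=\; \sum_{n \ge 0} a_n \langle x, y \rangle^n
\;=\; k(x, y),
\]

where x^\alpha = \prod_i x_i^{\alpha_i} and the middle equality is the multinomial theorem. For the Gaussian kernel one can write \exp(-\|x-y\|^2/2\sigma^2) = e^{-\|x\|^2/2\sigma^2}\, e^{-\|y\|^2/2\sigma^2}\, \exp(\langle x, y\rangle/\sigma^2) and absorb the two prefactors into the map built for the dot product factor \exp(\langle x, y\rangle/\sigma^2), whose coefficients a_n = \sigma^{-2n}/n! are nonnegative. Each coordinate of the resulting map is a weighted monomial of the input, which is the sense in which the feature-space structure becomes explicit.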

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Chen, D., He, Q., Dong, C., Wang, X. (2006). A Method to Construct the Mapping to the Feature Space for the Dot Product Kernels. In: Yeung, D.S., Liu, Z.-Q., Wang, X.-Z., Yan, H. (eds.) Advances in Machine Learning and Cybernetics. Lecture Notes in Computer Science (LNAI), vol. 3930. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11739685_96

  • DOI: https://doi.org/10.1007/11739685_96

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-33584-9

  • Online ISBN: 978-3-540-33585-6
