Abstract
Partial Least Squares Regression (PLS) and its kernel version (KPLS) have become competitive regression approaches. KPLS performs as well as or better than support vector regression (SVR) on moderately sized problems, with the advantages of simple implementation, lower training cost, and easier parameter tuning. Unlike SVR, however, KPLS requires manipulation of the full kernel matrix, and the resulting regression function depends on all of the training data. In this paper we rigorously derive a sparse KPLS algorithm. The underlying KPLS algorithm is modified to maintain sparsity in all steps. The resulting ν-KPLS algorithm explicitly models centering and bias rather than using kernel centering, and an ε-insensitive loss function is used to produce sparse solutions in the dual space. The final regression function of ν-KPLS requires only a relatively small set of support vectors.
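The contrast the abstract draws is between dense KPLS, whose dual coefficient vector involves every training point, and the sparse ν-KPLS derived in the paper. As background, here is a minimal sketch of dense kernel PLS regression in the dual, along the lines of the Rosipal–Trejo formulation; the RBF kernel, toy data, and all parameter choices below are illustrative assumptions, and the ε-insensitive sparsification of ν-KPLS itself is not reproduced here.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def kpls_fit(K, y, n_components=3):
    """Dense kernel PLS in the dual (NIPALS-style sketch).

    For a single response the inner NIPALS iteration converges in one
    step, so each weight vector u is just the normalized deflated
    response.  Returns dual coefficients alpha with predictions
    K(test, train) @ alpha.
    """
    n = K.shape[0]
    Kc = K.copy()                          # deflated kernel matrix
    yc = y.astype(float).reshape(-1, 1).copy()
    T = np.zeros((n, n_components))        # score vectors
    U = np.zeros((n, n_components))        # dual weight vectors
    for i in range(n_components):
        u = yc[:, 0] / np.linalg.norm(yc[:, 0])
        t = Kc @ u
        t = t / np.linalg.norm(t)
        T[:, i], U[:, i] = t, u
        # Deflate kernel and response by the new score direction.
        P = np.eye(n) - np.outer(t, t)
        Kc = P @ Kc @ P
        yc = yc - np.outer(t, t @ yc)
    # Dual regression coefficients: alpha = U (T' K U)^{-1} T' y.
    alpha = U @ np.linalg.solve(T.T @ K @ U, T.T @ y.reshape(-1, 1))
    return alpha.ravel()

# Illustrative toy data (not from the paper).
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(60, 1))
y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(60)
K = rbf_kernel(X, X, gamma=0.5)
alpha = kpls_fit(K, y, n_components=4)
y_hat = K @ alpha
```

Note that `alpha` here is generically fully dense, so evaluating the regression function at a new point requires a kernel evaluation against every training example; replacing the squared loss with an ε-insensitive loss, as the paper does, is what drives many dual coefficients to exactly zero.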
© 2003 Springer-Verlag Berlin Heidelberg
Cite this paper
Momma, M., Bennett, K.P. (2003). Sparse Kernel Partial Least Squares Regression. In: Schölkopf, B., Warmuth, M.K. (eds) Learning Theory and Kernel Machines. Lecture Notes in Computer Science, vol. 2777. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-45167-9_17
Print ISBN: 978-3-540-40720-1
Online ISBN: 978-3-540-45167-9