Abstract
We discuss end-to-end learning in the framework of functions expressed as linear combinations of reproducing kernels associated with the training samples. This paper shows that leave-one-out (LOO) evaluation can be executed very efficiently in this framework. The method is a simple extension of previous fast LOO algorithms from scalar-valued to vector-valued functions, yet it opens the door to multiple analyses of the same data at almost no additional cost. Using a newly defined LOO matrix, we demonstrate the effectiveness and the universality of this approach.
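As background for readers, the following is a minimal NumPy sketch of the classical closed-form LOO identity for kernel ridge regression, the scalar-valued setting that the paper extends to vector-valued functions. It uses the standard result that for a linear smoother with hat matrix H = K(K + λI)^{-1}, the LOO residual of sample i is the in-sample residual divided by (1 - H_ii). The function name loo_residuals_krr and its signature are illustrative only; this is not the paper's LOO matrix itself.

import numpy as np

def loo_residuals_krr(K, Y, lam):
    """Closed-form leave-one-out residuals for kernel ridge regression.

    K   : (n, n) kernel (Gram) matrix of the training samples
    Y   : (n, d) matrix of vector-valued targets
    lam : ridge regularization strength

    Applies e_i^loo = (y_i - f(x_i)) / (1 - H_ii), where
    H = K (K + lam I)^{-1}; for vector-valued outputs the same
    diagonal deflation applies to every output dimension.
    """
    n = K.shape[0]
    # (K + lam I)^{-1} K equals K (K + lam I)^{-1}: both factors are
    # functions of K and therefore commute, so one solve suffices.
    H = np.linalg.solve(K + lam * np.eye(n), K)
    residuals = Y - H @ Y            # in-sample residuals
    d = 1.0 - np.diag(H)             # LOO deflation factors
    return residuals / d[:, None]    # one row of LOO residuals per sample

# Illustrative usage with a synthetic RBF Gram matrix:
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
K = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
Y = rng.normal(size=(50, 3))
loo_mse = np.mean(loo_residuals_krr(K, Y, lam=0.1) ** 2)

This recovers every held-out prediction from a single fit rather than n refits, which is the kind of efficiency the paper's vector-valued extension targets.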
Acknowledgment
This work was partially supported by JSPS KAKENHI (Grant Number 19H04128).
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Kudo, M., Kimura, K., Morishita, S., Sun, L. (2022). Efficient Leave-One-Out Evaluation of Kernelized Implicit Mappings. In: Krzyzak, A., Suen, C.Y., Torsello, A., Nobile, N. (eds) Structural, Syntactic, and Statistical Pattern Recognition. S+SSPR 2022. Lecture Notes in Computer Science, vol 13813. Springer, Cham. https://doi.org/10.1007/978-3-031-23028-8_23