Abstract
This paper presents a flexible hypersurface fitting method based on RBF kernel functions. To fit a hypersurface to a given set of points in a Euclidean space, one can apply hyperplane fitting to the points mapped into a high-dimensional feature space. This fitting is equivalent to a one-dimensional reduction of the feature space, obtained by eliminating the linear subspace spanned by the eigenvector corresponding to the smallest eigenvalue of the variance-covariance matrix of the mapped data points. This dimension reduction is called minor component analysis (MCA); it solves the same eigenvalue problem as kernel principal component analysis but extracts the eigenvector corresponding to the smallest eigenvalue. In general, the feature space is taken to be a Euclidean space, which is a finite-dimensional Hilbert space. To formulate MCA for an infinite-dimensional Hilbert space, a kernel MCA (KMCA), that is, MCA in a reproducing kernel Hilbert space, must be constructed. However, the representer theorem does not hold for KMCA, because infinitely many zero eigenvalues appear in MCA for an infinite-dimensional Hilbert space. The fitting solution is therefore not uniquely determined in the infinite-dimensional case, in contrast to the unique solution obtained in a finite-dimensional Hilbert space. This ambiguity appears disadvantageous because it introduces instability into the fitting, but it also makes flexible fitting possible. Exploiting this flexibility, this paper develops a hypersurface fitting method in an infinite-dimensional Hilbert space with RBF kernel functions. Although several eigenvectors of the matrix defined from the kernel function evaluated at the samples are candidates for the solution, simulations under a specific setting identify a reasonable candidate among them, and the simulations also show that the flexibility of the method remains effective.
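To make the construction concrete, the following is a minimal numerical sketch of hypersurface fitting by RBF-kernel minor component analysis, not the authors' implementation: the function names, the eigenvalue threshold, and the choice of gamma are illustrative assumptions, and the sketch restricts the feature-space normal to the span of the centered mapped samples, the very representer-style restriction whose failure in infinite dimensions the paper analyzes. It forms the centered Gram matrix, takes the eigenvector of the smallest nonzero eigenvalue (the minor component), and returns a function f whose zero level set is the fitted hypersurface.

    import numpy as np

    def rbf_gram(A, B, gamma):
        # RBF kernel matrix: k(a, b) = exp(-gamma * ||a - b||^2)
        sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-gamma * sq_dists)

    def fit_hypersurface_kmca(X, gamma=1.0):
        # Fit {x : f(x) = 0} by minor component analysis in the RBF feature space.
        n = X.shape[0]
        K = rbf_gram(X, X, gamma)
        H = np.eye(n) - np.full((n, n), 1.0 / n)   # centering matrix
        Kc = H @ K @ H                             # centered Gram matrix
        lam, U = np.linalg.eigh(Kc)                # eigenvalues in ascending order
        keep = lam > 1e-10 * lam[-1]               # drop numerically zero eigenvalues
        lam, U = lam[keep], U[:, keep]
        # Minor component: eigenvector of the smallest nonzero eigenvalue.
        # Scaling by 1/sqrt(lambda) makes the feature-space normal unit length.
        alpha = U[:, 0] / np.sqrt(lam[0])
        bias = (K @ alpha).mean()                  # offset induced by centering
        def f(Y):
            return rbf_gram(np.atleast_2d(Y), X, gamma) @ alpha - bias
        return f

    # Illustrative usage: points near the unit circle in the plane;
    # |f| should be small near the circle and grow away from it.
    rng = np.random.default_rng(0)
    theta = rng.uniform(0.0, 2.0 * np.pi, size=100)
    X = np.c_[np.cos(theta), np.sin(theta)] + 0.02 * rng.standard_normal((100, 2))
    f = fit_hypersurface_kmca(X, gamma=2.0)

In the infinite-dimensional setting the paper describes, other eigenvectors with small nonzero eigenvalues are equally admissible normals; choosing among them is precisely the flexibility the method exploits, whereas this sketch simply fixes the smallest one.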
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Fujiki, J., Akaho, S. (2013). Flexible Hypersurface Fitting with RBF Kernels. In: Wilson, R., Hancock, E., Bors, A., Smith, W. (eds) Computer Analysis of Images and Patterns. CAIP 2013. Lecture Notes in Computer Science, vol 8047. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-40261-6_34
Print ISBN: 978-3-642-40260-9
Online ISBN: 978-3-642-40261-6