Abstract
The recently proposed Kreĭn space Support Vector Machine (KSVM) is an efficient classifier for indefinite learning problems, but it produces a non-sparse decision function. This dense decision function hinders practical applications because the out-of-sample extension is costly. In this paper we provide a post-processing technique that sparsifies the decision function obtained by a Kreĭn space SVM and variants thereof. We evaluate the influence of different levels of sparsity and employ a Nyström approach to address large-scale problems. Experiments show that our algorithm is similarly efficient to the non-sparse Kreĭn space Support Vector Machine, but at substantially lower cost, so that large-scale problems can also be processed.
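The dense decision function of a kernel SVM has the form f(x) = Σ_i α_i k(x_i, x) + b, with nonzero α_i for essentially all training points. One plausible post-processing route to sparsify such an expansion is a greedy orthogonal matching pursuit over the columns of the kernel matrix, sketched below; the function name and the choice of OMP as the sparsifier are illustrative assumptions, not necessarily the exact procedure of the paper.

```python
import numpy as np

def omp_sparsify(K, f_dense, n_terms):
    """Approximate dense decision values f_dense ~ K @ alpha by a sparse
    kernel expansion over at most n_terms expansion points, using greedy
    orthogonal matching pursuit on the columns of the kernel matrix K."""
    n = K.shape[0]
    residual = f_dense.astype(float).copy()
    selected = []
    coef = np.zeros(n)
    for _ in range(n_terms):
        # correlation of every kernel column with the current residual
        scores = np.abs(K.T @ residual)
        scores[selected] = -np.inf        # never re-pick a chosen column
        selected.append(int(np.argmax(scores)))
        # least-squares refit of the coefficients on the chosen columns
        sub = K[:, selected]
        c, *_ = np.linalg.lstsq(sub, f_dense, rcond=None)
        coef[:] = 0.0
        coef[selected] = c
        residual = f_dense - sub @ c
    return coef, selected
```

With n_terms equal to the training size (and an invertible kernel matrix) the refit reproduces the dense function; smaller values trade accuracy for a cheaper out-of-sample extension, since evaluating a test point then needs only n_terms kernel evaluations instead of one per training point.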
Notes
1. Obtained by evaluating k(x, y) for training points x, y.
2. A similar strategy may be possible for KSVM, but it is considerably more complicated: typically quite many points are support vectors, and a dedicated sparse SVM solver would be necessary.
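For large training sets, even storing the full kernel matrix from note 1 becomes infeasible. A generic landmark-based Nyström approximation, K ≈ C W⁺ Cᵀ with C the n×m cross-kernel block and W the m×m landmark block, is sketched below; the uniform landmark sampling, the toy Gaussian kernel, and the function names are illustrative assumptions, not necessarily the variant used in the paper.

```python
import numpy as np

def gauss_kernel(A, B):
    """Toy Gaussian kernel, used only for this illustration."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2)

def nystroem_blocks(kernel, X, m, rng):
    """Return the factors of the Nystroem approximation K ~ C @ W_pinv @ C.T,
    built from m uniformly sampled landmark points."""
    idx = rng.choice(len(X), size=m, replace=False)
    C = kernel(X, X[idx])                            # n x m cross-kernel block
    W_pinv = np.linalg.pinv(kernel(X[idx], X[idx]))  # pseudo-inverse of m x m landmark block
    return C, W_pinv
```

Downstream solvers can then work with the n×m factor instead of the full n×n matrix, reducing storage from O(n²) to O(nm); with m = n (and an invertible kernel matrix) the approximation is exact.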
Acknowledgment
We would like to thank Gaelle Bonnet-Loosli for providing support with the Kreĭn space SVM.
Copyright information
© 2018 Springer Nature Switzerland AG
Cite this paper
Schleif, F.-M., Raab, C., Tino, P. (2018). Sparsification of Indefinite Learning Models. In: Bai, X., Hancock, E., Ho, T., Wilson, R., Biggio, B., Robles-Kelly, A. (eds.) Structural, Syntactic, and Statistical Pattern Recognition (S+SSPR 2018). Lecture Notes in Computer Science, vol. 11004. Springer, Cham. https://doi.org/10.1007/978-3-319-97785-0_17
Print ISBN: 978-3-319-97784-3
Online ISBN: 978-3-319-97785-0