Abstract
This paper investigates a robust kernel approximation scheme for support vector machine (SVM) classification with indefinite kernels. It tackles the setting in which the indefinite kernel is contaminated by noise and outliers, i.e., it is a noisy observation of a true positive definite (PD) kernel. Traditional algorithms recover the PD kernel from the observation under a small-Gaussian-noise assumption; however, this approach is not robust to noise and outliers that do not follow a Gaussian distribution. In this paper, we assume that the error follows a Gaussian-Laplacian distribution, which simultaneously captures dense Gaussian noise and sparse/abnormal noise and outliers. The resulting optimization problem, comprising kernel learning and dual SVM classification, can be solved by an alternating iterative algorithm. Experiments on various benchmark data sets show the robustness of the proposed method compared with other state-of-the-art kernel-modification-based methods.
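The alternating scheme described above can be sketched as follows. This is a minimal illustration under simplifying assumptions, not the authors' exact algorithm: the function names are our own, the objective is simplified to a Frobenius fit plus an \(\ell_1\) penalty (the negative log-likelihoods of the Gaussian and Laplacian error terms), and the coupled dual-SVM step is omitted.

```python
import numpy as np

def psd_project(M):
    """Project a symmetric matrix onto the PSD cone by
    clipping negative eigenvalues (spectral projection)."""
    M = (M + M.T) / 2
    w, V = np.linalg.eigh(M)
    return (V * np.clip(w, 0, None)) @ V.T

def soft_threshold(M, tau):
    """Elementwise soft-thresholding: the proximal operator of the
    l1 norm induced by the Laplacian (sparse-noise) assumption."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def denoise_kernel(K0, lam=0.1, n_iter=50):
    """Split an observed indefinite kernel K0 into a PSD part K
    (dense Gaussian residual absorbed by the Frobenius fit) and a
    sparse part S (Laplacian noise/outliers) by alternating
    minimization of ||K0 - K - S||_F^2 + lam * ||S||_1, K PSD."""
    S = np.zeros_like(K0)
    for _ in range(n_iter):
        K = psd_project(K0 - S)              # update the PSD kernel
        S = soft_threshold(K0 - K, lam / 2)  # update the sparse noise
    return K, S
```

Each sub-problem has a closed-form minimizer (eigenvalue clipping and soft-thresholding, respectively), which is why an alternating scheme is natural here.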
Notes
1. The kernel matrix \(\mathbf {K}\) associated with a positive definite kernel \(\mathcal {K}\) is PSD.
2. The probability density function of a zero-mean Gaussian random variable x is defined as \(f_{\mathcal {N}}(x)=\frac{1}{\sqrt{2\pi }\sigma _N} \exp \big ( -\frac{x^2}{2\sigma _N^2} \big )\).
3. The probability density function of a zero-mean Laplacian random variable x is defined as \(f_{\mathcal {L}}(x)=\frac{1}{\sqrt{2}\sigma _L} \exp \big ( -\frac{\sqrt{2}|x|}{\sigma _L} \big )\).
4. Accuracy is defined as the percentage of all instances predicted correctly.
5. Recall is the percentage of true positives that are correctly predicted positive.
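The quantities in Notes 2–5 can be written out directly. The following is a small sketch with helper names of our own choosing, assuming zero-mean densities and binary labels:

```python
import numpy as np

def gaussian_pdf(x, sigma):
    """Zero-mean Gaussian density with standard deviation sigma:
    models dense, small-magnitude noise."""
    return np.exp(-x**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

def laplacian_pdf(x, sigma):
    """Zero-mean Laplacian density with standard deviation sigma:
    heavy tails model sparse/abnormal noise and outliers."""
    return np.exp(-np.sqrt(2) * np.abs(x) / sigma) / (np.sqrt(2) * sigma)

def accuracy(y_true, y_pred):
    """Fraction of all instances predicted correctly (Note 4)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.mean(y_true == y_pred)

def recall(y_true, y_pred, positive=1):
    """Fraction of true positives correctly predicted positive (Note 5)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    pos = y_true == positive
    return np.mean(y_pred[pos] == positive)
```

Note that the Laplacian is parameterized here by its standard deviation \(\sigma_L\) (scale \(\sigma_L/\sqrt{2}\)), matching Note 3, which makes its heavier tails directly comparable with a Gaussian of the same variance.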
Acknowledgment
This work was supported in part by the National Natural Science Foundation of China under Grants 61572315, 6151101179, and 61603248, and in part by the 863 Plan of China under Grant 2015AA042308.
© 2017 Springer International Publishing AG
Cite this paper
Liu, F., Huang, X., Peng, C., Yang, J., Kasabov, N. (2017). Robust Kernel Approximation for Classification. In: Liu, D., Xie, S., Li, Y., Zhao, D., El-Alfy, ES. (eds) Neural Information Processing. ICONIP 2017. Lecture Notes in Computer Science(), vol 10634. Springer, Cham. https://doi.org/10.1007/978-3-319-70087-8_31