Abstract
Recently, evolving multiple kernel learning methods have attracted researchers' attention due to their ability to find a composite kernel with an optimal mapping model in a large, high-dimensional feature space. However, computing the composite kernel in a stationary way, with the same weights for all samples, is not always appropriate. In this paper, we propose an evolving non-stationary multiple kernel learning method, in which base kernels are encoded as tree kernels and a gating function simultaneously determines the weights of those tree kernels. The resulting classifiers obtain a composite kernel with an optimal mapping model and select the most appropriate combination weights according to the input sample. Experimental results on several UCI datasets illustrate the validity of the proposed method.
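To make the non-stationary idea concrete, the following is a minimal sketch of a sample-dependent (gated) kernel combination. The base kernels (linear and RBF), the sigmoid form of the gating function, and the fixed gating parameter `v` are all illustrative assumptions for this sketch, not the paper's evolved tree kernels or its learned gating model.

```python
import numpy as np

# Two illustrative base kernels; the paper evolves tree-structured
# kernels, so these simple closed-form choices are placeholders.
def linear_kernel(x, y):
    return float(np.dot(x, y))

def rbf_kernel(x, y, gamma=1.0):
    return float(np.exp(-gamma * np.sum((x - y) ** 2)))

def gate(x, v):
    """Sigmoid gating function: maps a sample to a weight in (0, 1).
    In a full method the parameter vector v would be learned jointly
    with the classifier; here it is fixed for illustration."""
    return 1.0 / (1.0 + np.exp(-float(np.dot(v, x))))

def composite_kernel(x, y, v):
    """Non-stationary combination: each sample contributes its own
    gating weight, so the kernel mixture varies over the input space.
    The product form g(x)g(y)*K1 + (1-g(x))(1-g(y))*K2 keeps the
    combined kernel positive semidefinite."""
    gx, gy = gate(x, v), gate(y, v)
    return (gx * gy * linear_kernel(x, y)
            + (1.0 - gx) * (1.0 - gy) * rbf_kernel(x, y))
```

A stationary combination would use constant weights `w1*K1 + w2*K2` for every pair; here the effective weights depend on where `x` and `y` lie in the input space, which is the property the proposed method exploits.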
© 2014 Springer International Publishing Switzerland
Cite this paper
Wu, P., Yin, Q., Guo, P. (2014). Method of Evolving Non-stationary Multiple Kernel Learning. In: Loo, C.K., Yap, K.S., Wong, K.W., Teoh, A., Huang, K. (eds) Neural Information Processing. ICONIP 2014. Lecture Notes in Computer Science, vol 8835. Springer, Cham. https://doi.org/10.1007/978-3-319-12640-1_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-12639-5
Online ISBN: 978-3-319-12640-1