Abstract
In this paper, a novel sparse kernel recursive least squares algorithm, the Projected Kernel Recursive Least Squares (PKRLS) algorithm, is proposed. In PKRLS, a simple online vector projection (VP) method measures the similarity between the current input and the dictionary in a feature space. This projection makes fuller use of the information contained in the data when updating the solution. Compared with the quantized kernel recursive least squares (QKRLS) algorithm, a kernel adaptive filter that applies vector quantization (VQ) in the input space, simulation results show that PKRLS achieves comparable filtering performance in terms of network size and testing mean square error.
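The abstract's exact VP criterion is not given here, but the general pattern it describes, an online kernel RLS whose dictionary grows only when a new input is sufficiently dissimilar from the stored centers, can be sketched as follows. This is a hypothetical, generic similarity-test variant (the class name `SparseKRLS`, the threshold `delta`, and the target-merging rule are illustrative assumptions, not the paper's method):

```python
import numpy as np

def gauss_kernel(x, y, sigma=1.0):
    # Gaussian kernel between two input vectors
    return np.exp(-np.sum((np.asarray(x) - np.asarray(y)) ** 2) / (2 * sigma ** 2))

class SparseKRLS:
    """Online kernel RLS with a similarity-based dictionary test (sketch).

    A new input joins the dictionary only when its maximum kernel
    similarity to the stored centers is below `delta`; otherwise its
    target is merged into the nearest center. This is a generic
    coherence-style criterion, not the paper's exact VP rule.
    """
    def __init__(self, sigma=1.0, delta=0.9, lam=1e-2):
        self.sigma, self.delta, self.lam = sigma, delta, lam
        self.centers, self.targets = [], []

    def _solve(self):
        # Regularized least squares over the current dictionary.
        # (A true KRLS would update the inverse recursively; the batch
        # solve here keeps the sketch short.)
        n = len(self.centers)
        K = np.array([[gauss_kernel(a, b, self.sigma)
                       for b in self.centers] for a in self.centers])
        self.alpha = np.linalg.solve(K + self.lam * np.eye(n),
                                     np.array(self.targets))

    def update(self, x, d):
        if not self.centers:
            self.centers.append(x); self.targets.append(d)
        else:
            sims = [gauss_kernel(x, c, self.sigma) for c in self.centers]
            if max(sims) < self.delta:      # novel enough: grow dictionary
                self.centers.append(x); self.targets.append(d)
            else:                           # redundant: merge into nearest center
                j = int(np.argmax(sims))
                self.targets[j] = 0.5 * (self.targets[j] + d)
        self._solve()

    def predict(self, x):
        return sum(a * gauss_kernel(x, c, self.sigma)
                   for a, c in zip(self.alpha, self.centers))
```

The dictionary size stays bounded by how densely `delta`-dissimilar points can pack the input region, which is the mechanism behind the "sparse network sizes" compared in the abstract.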
References
Shawe-Taylor, J., Cristianini, N.: Kernel Methods for Pattern Analysis. Cambridge University Press, Cambridge (2004)
Liu, W., Príncipe, J.C., Haykin, S.: Kernel Adaptive Filtering: A Comprehensive Introduction. Wiley, New York (2010)
Liu, W., Pokharel, P.P., Príncipe, J.C.: The kernel least-mean-square algorithm. IEEE Trans. Signal Process. 56, 543–554 (2008)
Liu, W., Príncipe, J.C.: Kernel affine projection algorithms. EURASIP J. Adv. Signal Process. 2008, 1–13 (2008)
Engel, Y., Mannor, S., Meir, R.: The kernel recursive least-squares algorithm. IEEE Trans. Signal Process. 52, 2275–2285 (2004)
Wu, Z., Shi, J., Zhang, X., Ma, W., Chen, B.: Kernel recursive maximum correntropy. Signal Process. 117, 11–16 (2015)
Platt, J.: A resource-allocating network for function interpolation. Neural Comput. 3, 213–225 (1991)
Liu, W., Park, I., Príncipe, J.C.: An information theoretic approach of designing sparse kernel adaptive filters. IEEE Trans. Neural Netw. 20, 1950–1961 (2009)
Richard, C., Bermudez, J.C.M., Honeine, P.: Online prediction of time series data with kernels. IEEE Trans. Signal Process. 57, 1058–1067 (2009)
Chen, B., Zhao, S., Zhu, P., Príncipe, J.C.: Quantized kernel least mean square algorithm. IEEE Trans. Neural Netw. Learn. Syst. 23, 22–32 (2012)
Chen, B., Zhao, S., Zhu, P., Príncipe, J.C.: Quantized kernel recursive least squares algorithm. IEEE Trans. Neural Netw. Learn. Syst. 24, 1484–1491 (2013)
Nan, S., Sun, L., Chen, B., Lin, Z., Toh, K.A.: Density-dependent quantized least squares support vector machine for large data sets. IEEE Trans. Neural Netw. Learn. Syst. 28, 1–13 (2015)
Wang, S., Zheng, Y., Duan, S., Wang, L., Tan, H.: Quantized kernel maximum correntropy and its mean square convergence analysis. Digit. Signal Process. 63, 164–176 (2017)
Honeine, P.: Approximation errors of online sparsification criteria. IEEE Trans. Signal Process. 63, 4700–4709 (2015)
Acknowledgments
The work was supported in part by the National Natural Science Foundation of China (Grant nos. 61374117, 61004048, 61174137, and 61104064), the Natural Science Foundation of Jiangsu Province (Grant no. BK2010493), a grant from the Science and Technology Department of Sichuan Province (Grant no. 2014GZ0156), and a grant from the China Postdoctoral Science Foundation funded project (Grant no. 2012M510135).
Copyright information
© 2017 Springer International Publishing AG
Cite this paper
Zhao, J., Zhang, H. (2017). Projected Kernel Recursive Least Squares Algorithm. In: Liu, D., Xie, S., Li, Y., Zhao, D., El-Alfy, ES. (eds) Neural Information Processing. ICONIP 2017. Lecture Notes in Computer Science(), vol 10634. Springer, Cham. https://doi.org/10.1007/978-3-319-70087-8_38
DOI: https://doi.org/10.1007/978-3-319-70087-8_38
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-70086-1
Online ISBN: 978-3-319-70087-8
eBook Packages: Computer Science (R0)