Abstract
This paper continues and extends our previous research, in which the kernel normalized mixed-norm (KNMN) algorithm, a combination of the kernel trick with the mixed-norm strategy, was proposed and shown to deliver superior performance for system identification in non-Gaussian environments. There, we also introduced a naive adaptive mixing parameter (AMP) update mechanism to make KNMN more robust in nonstationary scenarios. The main contributions of this paper are threefold: first, the \(\ell _p\)-norm is substituted for the \(\ell _4\)-norm in the cost function, yielding a generalized form of the mixed norm; second, in place of the original AMP from our previous work, a novel time-varying AMP is employed to track nonstationarity more closely; and third, a mean square convergence analysis is conducted, in which the second-moment behavior of the weight error vector is studied in detail. Simulations on two benchmark system identification problems, each with several kinds of additive noise, verify the effectiveness of these improvements.
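As a rough illustration of the mixed-norm idea behind KNMN, the sketch below combines an \(\ell _2\) and an \(\ell _p\) error term in a KLMS-style kernel filter with a growing dictionary. All names here are hypothetical, the normalization is simplified to the kernel self-similarity \(\kappa(x,x)\), and the mixing parameter `lam` is held fixed (the paper's AMP is time-varying); this is a sketch of the general technique, not the authors' implementation.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian kernel between two input vectors."""
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

class KernelMixedNormFilter:
    """Illustrative kernel adaptive filter with a mixed l2/lp error cost.

    Per-sample cost: J(k) = lam * e(k)^2 + (1 - lam) * |e(k)|^p,
    minimized by stochastic gradient descent in the RKHS (KLMS-style
    growing dictionary). lam is fixed here; the paper uses a
    time-varying adaptive mixing parameter instead.
    """

    def __init__(self, step=0.2, lam=0.5, p=4, sigma=1.0, eps=1e-6):
        self.step, self.lam, self.p = step, lam, p
        self.sigma, self.eps = sigma, eps
        self.centers, self.coeffs = [], []

    def predict(self, x):
        # Filter output: weighted sum of kernels centered on past inputs.
        return sum(a * gaussian_kernel(c, x, self.sigma)
                   for c, a in zip(self.centers, self.coeffs))

    def update(self, x, d):
        e = d - self.predict(x)
        # Gradient of the mixed-norm cost with respect to the error:
        # dJ/de = 2*lam*e + (1-lam)*p*|e|^(p-1)*sign(e)
        g = (2 * self.lam * e
             + (1 - self.lam) * self.p * abs(e) ** (self.p - 1) * np.sign(e))
        # Simplified normalization by kernel self-similarity (1 for Gaussian).
        norm = gaussian_kernel(x, x, self.sigma) + self.eps
        self.centers.append(np.asarray(x, dtype=float))
        self.coeffs.append(self.step * g / norm)
        return e
```

With \(p=2\) the update reduces to plain normalized KLMS; larger \(p\) penalizes large errors more aggressively, which is why the mixing parameter matters under impulsive (non-Gaussian) noise.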
Notes
1. The posterior error \(e_p(k)\) defined here is different from \(e_{post}(k)\) in (10).
Acknowledgments
This work is partially supported by the National Natural Science Foundation of China (No. 61402122) and the 2014 Ph.D. Recruitment Program of Guizhou Normal University.
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Yu, S. et al. (2015). Generalized Kernel Normalized Mixed-Norm Algorithm: Analysis and Simulations. In: Arik, S., Huang, T., Lai, W., Liu, Q. (eds) Neural Information Processing. ICONIP 2015. Lecture Notes in Computer Science(), vol 9490. Springer, Cham. https://doi.org/10.1007/978-3-319-26535-3_8
DOI: https://doi.org/10.1007/978-3-319-26535-3_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-26534-6
Online ISBN: 978-3-319-26535-3
eBook Packages: Computer Science (R0)