
Projected Kernel Recursive Least Squares Algorithm

  • Conference paper
Neural Information Processing (ICONIP 2017)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 10634)


Abstract

In this paper, a novel sparse kernel recursive least squares algorithm, the Projected Kernel Recursive Least Squares (PKRLS) algorithm, is proposed. In PKRLS, a simple online vector projection (VP) method represents the similarity between the current input and the dictionary in the feature space. The projection step makes full use of the information contained in the data when updating the solution. Compared with the quantized kernel recursive least squares (QKRLS) algorithm, a kernel adaptive filter that applies vector quantization (VQ) in the input space, simulation results show that PKRLS achieves comparable filtering performance in terms of network sparsity and testing mean square error.
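The paper's exact vector-projection criterion and update equations are not reproduced on this page, so the following is only a minimal sketch of the general idea behind projection-based sparse kernel RLS: a new input joins the dictionary only when its feature-space projection onto the current dictionary leaves a large residual, and the filter coefficients are then refined by a standard recursive least squares step. The class name, the threshold `nu`, and the regularizer `lam` are hypothetical illustration choices, not the authors' implementation.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=0.5):
    """Gaussian (RBF) kernel between two input vectors."""
    return float(np.exp(-np.sum((np.asarray(x) - np.asarray(y)) ** 2) / (2 * sigma**2)))

class ProjectionKRLS:
    """Sketch of a sparse kernel RLS filter: the dictionary grows only when
    the new input's feature vector is poorly approximated by its projection
    onto the span of the current dictionary (residual > nu)."""

    def __init__(self, sigma=0.5, nu=0.1, lam=1e-2):
        self.sigma, self.nu, self.lam = sigma, nu, lam
        self.dict_ = []     # dictionary of stored input vectors
        self.alpha = None   # expansion coefficients of the filter
        self.Kinv = None    # inverse of the regularized dictionary kernel matrix
        self.P = None       # RLS inverse-correlation matrix

    def _kvec(self, x):
        return np.array([gaussian_kernel(c, x, self.sigma) for c in self.dict_])

    def predict(self, x):
        return float(self._kvec(x) @ self.alpha) if self.dict_ else 0.0

    def update(self, x, y):
        x = np.asarray(x, dtype=float)
        k_xx = gaussian_kernel(x, x, self.sigma)
        if not self.dict_:
            self.dict_ = [x]
            self.Kinv = np.array([[1.0 / (k_xx + self.lam)]])
            self.P = np.array([[1.0 / self.lam]])
            self.alpha = np.zeros(1)
            k = np.array([k_xx])
        else:
            k = self._kvec(x)
            a = self.Kinv @ k                 # projection coefficients
            delta = k_xx - k @ a              # squared projection residual
            if delta > self.nu:               # novel input: grow the dictionary
                n = len(self.dict_)
                d = delta + self.lam
                Kinv = np.zeros((n + 1, n + 1))   # block-inverse update of Kinv
                Kinv[:n, :n] = self.Kinv + np.outer(a, a) / d
                Kinv[:n, n] = Kinv[n, :n] = -a / d
                Kinv[n, n] = 1.0 / d
                self.Kinv = Kinv
                P = np.zeros((n + 1, n + 1))      # extend the RLS state
                P[:n, :n] = self.P
                P[n, n] = 1.0 / self.lam
                self.P = P
                self.alpha = np.append(self.alpha, 0.0)
                self.dict_.append(x)
                k = np.append(k, k_xx)
        # standard RLS update of the coefficients over the (possibly grown) dictionary
        err = y - k @ self.alpha
        g = self.P @ k / (1.0 + k @ self.P @ k)
        self.alpha = self.alpha + g * err
        self.P = self.P - np.outer(g, k @ self.P)
```

With `nu = 0` every sample would enter the dictionary and this reduces to ordinary KRLS; a larger `nu` trades accuracy for a smaller network, which is the sparsity/accuracy trade-off the abstract compares against QKRLS.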



Acknowledgments

The work was supported in part by the National Natural Science Foundation of China (Grant nos. 61374117, 61004048, 61174137, and 61104064), the Natural Science Foundation of Jiangsu Province (Grant no. BK2010493), a grant from the Science and Technology Department of Sichuan Province (Grant no. 2014GZ0156), and a China Postdoctoral Science Foundation funded project (Grant no. 2012M510135).

Author information


Corresponding author

Correspondence to Ji Zhao.



Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Zhao, J., Zhang, H. (2017). Projected Kernel Recursive Least Squares Algorithm. In: Liu, D., Xie, S., Li, Y., Zhao, D., El-Alfy, E.S. (eds) Neural Information Processing. ICONIP 2017. Lecture Notes in Computer Science, vol 10634. Springer, Cham. https://doi.org/10.1007/978-3-319-70087-8_38

  • DOI: https://doi.org/10.1007/978-3-319-70087-8_38

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-70086-1

  • Online ISBN: 978-3-319-70087-8

  • eBook Packages: Computer Science, Computer Science (R0)
