
Novel Biologically Inspired Approaches to Extracting Online Information from Temporal Data


Abstract

In this paper, we aim to develop novel learning approaches for extracting invariant features from time series. Specifically, we implement an existing method for solving the generalized eigenproblem and use it, first, to realise the biologically inspired technique of slow feature analysis (SFA), originally developed by Wiskott and Sejnowski (Neural Comput 14:715–770, 2002), and a rival method derived earlier by Stone (Neural Comput 8(7):1463–1492, 1996). Second, we investigate preprocessing the data with echo state networks (ESNs) (Lukoševičius and Jaeger in Comput Sci Rev 3(3):127–149, 2009) and show that the combination of generalized eigensolver and ESN is a powerful and more biologically plausible implementation of SFA. Third, we investigate the effect of higher-order derivatives as a smoothing constraint and show that it improves the overall smoothness of the output signal. We demonstrate the potential of the proposed techniques, benchmarked against state-of-the-art approaches, on datasets comprising artificial data, MNIST digits and handwritten character trajectories.
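To make the pipeline concrete, below is a minimal sketch (in Python with NumPy/SciPy) of linear SFA solved as a generalized eigenproblem, with an echo state network used as a temporal preprocessor, in the spirit of the combination described above. It is not the authors' implementation: the function names (esn_states, slow_features), the reservoir configuration, all parameter values and the toy input signal are illustrative assumptions. Setting order > 1 in slow_features plays the role of the higher-order derivative smoothing constraint mentioned in the abstract.

```python
# Illustrative sketch only -- not the authors' code. Assumes NumPy and SciPy.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)

def esn_states(u, n_res=100, spectral_radius=0.9, input_scale=0.5):
    """Drive a randomly initialised echo state network with the input
    sequence u (T x d_in) and return the reservoir states (T x n_res)."""
    T, d_in = u.shape
    W_in = rng.uniform(-input_scale, input_scale, (n_res, d_in))
    W = rng.standard_normal((n_res, n_res))
    # Scale the recurrent weights so the spectral radius is below 1
    # (a standard heuristic for the echo state property).
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    h = np.zeros(n_res)
    states = np.empty((T, n_res))
    for t in range(T):
        h = np.tanh(W @ h + W_in @ u[t])
        states[t] = h
    return states

def slow_features(x, n_out=2, order=1):
    """Linear SFA as a generalized eigenproblem A w = lambda B w, where
    A is the covariance of the temporal derivative of x and B is the
    covariance of x itself; small eigenvalues = slowly varying outputs."""
    x = x - x.mean(axis=0)              # centre the (expanded) signal
    dx = np.diff(x, n=order, axis=0)    # order > 1: higher-order smoothing
    A = dx.T @ dx / len(dx)             # derivative covariance
    B = x.T @ x / len(x)                # signal covariance
    # eigh returns eigenvalues in ascending order, so the leading
    # eigenvectors give the slowest output directions.
    eigvals, W = eigh(A, B + 1e-8 * np.eye(B.shape[0]))
    return x @ W[:, :n_out], eigvals[:n_out]

# Toy input: a slow sinusoid amplitude-modulating a fast carrier.
t = np.linspace(0, 4 * np.pi, 2000)
u = np.column_stack([np.sin(t) * np.cos(11 * t),
                     np.cos(t) * np.cos(11 * t)])

y, slowness = slow_features(esn_states(u))  # ESN expansion, then SFA readout
print(y.shape, slowness)
```

Under this construction the ESN supplies a rich nonlinear expansion of the input history, so a purely linear generalized eigensolver suffices as the SFA readout; this is one plausible reading of why the ESN-plus-eigensolver combination works well.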


References

  1. Abdullah A, Hussain A. A new biclustering technique based on crossing minimization. Neurocomputing. 2006;69(16):1882–96.

  2. Antonelo E, Schrauwen B. Learning slow features with reservoir computing for biologically-inspired robot localization. Neural Netw. 2012;25:178–90.

  3. Bache K, Lichman M. UCI machine learning repository. Irvine, CA: University of California, School of Information and Computer Science; 2013. http://archive.ics.uci.edu/ml.

  4. Berkes P. Pattern recognition with slow feature analysis. Cognitive Sciences EPrint Archive (CogPrints) 4104, 2005.

  5. Blaschke T, Berkes P, Wiskott L. What is the relation between slow feature analysis and independent component analysis? Neural Comput. 2006;18(10):2495–508.

  6. Bush K, Anderson C. Modeling reward functions for incomplete state representations via echo state network. In: Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN'05), Vol. 5. IEEE; 2005.

  7. Cheema TA, Qureshi IM, Hussain A. Blind image deconvolution using space-variant neural network approach. Electron Lett. 2005;41(6):308–09.

  8. Ding Y, Song Y, Fan S, Qu Z, Chen L. Specificity and generalization of visual perceptual learning in humans: an event-related potential study. Neuroreport. 2003;14(4):587–90.

  9. Földiák P. Learning invariance from transformation sequences. Neural Comput. 1991;3(2):194–200.

  10. Gou Z, Fyfe C. A canonical correlation neural network for multicollinearity and functional data. Neural Netw. 2004;17(2):285–93.

  11. Gou Z, Fyfe C. A family of networks which perform canonical correlation analysis. Int J Knowl-Based Intell Eng Syst. 2001;5(2):76–82.

  12. Green CS, Bavelier D. Exercising your brain: a review of human brain plasticity and training-induced learning. Psychol Aging. 2008;23(4):692.

  13. Huang Y, Zhao J, Tian M, Zou Q, Luo S. Slow feature discriminant analysis and its application on handwritten digit recognition. In: Proceedings of the 2009 International Joint Conference on Neural Networks (IJCNN 2009). IEEE; 2009. p. 1294–7.

  14. Jaeger H, Haas H. Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication. Science. 2004;304(5667):78–80.

  15. Jaeger H. Short term memory in echo state networks. GMD-Forschungszentrum Informationstechnik. 2001.

  16. Knowlton BJ, Mangels JA, Squire LR. A neostriatal habit learning system in humans. Science. 1996;273(5280):1399–402.

  17. Kompella VR, Luciw M, Schmidhuber J. Incremental slow feature analysis: adaptive low-complexity slow feature updating from high-dimensional input streams. Neural Comput. 2012;24(11):2994–3024.

  18. Legenstein R, Wilbert N, Wiskott L. Reinforcement learning on slow features of high-dimensional input streams. PLoS Comput Biol. 2010;6(8):e1000894.

  19. Lukoševičius M, Jaeger H. Reservoir computing approaches to recurrent neural network training. Comput Sci Rev. 2009;3(3):127–49.

  20. Mangels JA, Butterfield B, Lamb J, Good C, Dweck CS. Why do beliefs about intelligence influence learning success? A social cognitive neuroscience model. Soc Cogn Affect Neurosci. 2006;1(2):75–86.

  21. Peng D, Yi Z, Luo W. Convergence analysis of a simple minor component analysis algorithm. Neural Netw. 2007;20(7):842–50.

  22. Plöger PG, Arghir A, Gunther T, Hosseiny R. Echo state networks for mobile robot modeling and control. In: RoboCup 2003: Robot Soccer World Cup VII. Springer Berlin Heidelberg; 2004. p. 157–68.

  23. Qu Z, Song Y, Ding Y. ERP evidence for distinct mechanisms of fast and slow visual perceptual learning. Neuropsychologia. 2010;48(6):1869–74.

  24. Schraudolph NN, Sejnowski TJ. Competitive anti-Hebbian learning of invariants. In: Advances in Neural Information Processing Systems (NIPS), Vol. 4; 1991.

  25. Skowronski MD, Harris JG. Minimum mean squared error time series classification using an echo state network prediction model. In: Proceedings of the 2006 IEEE International Symposium on Circuits and Systems (ISCAS 2006). IEEE; 2006.

  26. Stone JV. Learning perceptually salient visual parameters using spatiotemporal smoothness constraints. Neural Comput. 1996;8(7):1463–92.

  27. Tong MH, Bickett AD, Christiansen EM, Cottrell GW. Learning grammatical structure with echo state networks. Neural Netw. 2007;20(3):424–32.

  28. Turner R, Sahani M. A maximum-likelihood interpretation for slow feature analysis. Neural Comput. 2007;19(4):1022–38.

  29. Wang TD, Fyfe C. Visualising temporal data using reservoir computing. J Inf Sci Eng. 2013;29(4):695–709.

  30. Wang TD, Wu X, Fyfe C. Factors important for good visualisation of time series. Int J Comput Sci Eng. (in press).

  31. Weng J, Zhang Y, Hwang W. Candid covariance-free incremental principal component analysis. IEEE Trans Pattern Anal Mach Intell. 2003;25(8):1034–40.

  32. Werbos PJ. Intelligence in the brain: a theory of how it works and how to build it. Neural Netw. 2009;22(3):200–12.

  33. Wiskott L, Sejnowski TJ. Slow feature analysis: unsupervised learning of invariances. Neural Comput. 2002;14(4):715–70.

  34. Wiskott L. Estimating driving forces of nonstationary time series with slow feature analysis; 2003. arXiv preprint cond-mat/0312317.

  35. Zhang Z, Zhao M, Chow TW. Binary- and multi-class group sparse canonical correlation analysis for feature extraction and classification. IEEE Trans Knowl Data Eng. 2013;25(10):2192–205.

  36. Zhang Q, Leung YW. A class of learning algorithms for principal component analysis and minor component analysis. IEEE Trans Neural Netw. 2000;11(2):529–33.

Acknowledgments

The first author is grateful to Professor Colin Fyfe, formerly of the University of the West of Scotland, for his insightful suggestions, which helped improve the writing of this paper.

Author information

Corresponding author

Correspondence to Zeeshan Khawar Malik.

Appendix

See Table 2.

Table 2 Euclidean distance matrix between a single 'a' and the projections of a, b, c, d and e onto the average filter for 20 a's

Cite this article

Malik, Z.K., Hussain, A. & Wu, J. Novel Biologically Inspired Approaches to Extracting Online Information from Temporal Data. Cogn Comput 6, 595–607 (2014). https://doi.org/10.1007/s12559-014-9257-0

