Abstract
In this paper, we develop novel learning approaches for extracting invariant features from time series. Specifically, we implement an existing method for solving the generalized eigenproblem and use it, first, to implement the biologically inspired technique of slow feature analysis (SFA), originally developed by Wiskott and Sejnowski (Neural Comput 14:715–770, 2002), alongside a rival method derived earlier by Stone (Neural Comput 8(7):1463–1492, 1996). Second, we investigate preprocessing the data with echo state networks (ESNs) (Lukoševičius and Jaeger in Comput Sci Rev 3(3):127–149, 2009) and show that the combination of generalized eigensolver and ESN yields a powerful and more biologically plausible implementation of SFA. Third, we investigate the effect of higher-order derivatives as a smoothing constraint and demonstrate the resulting smoothness of the output signal. We demonstrate the potential of the proposed techniques, benchmarked against state-of-the-art approaches, on artificial datasets, MNIST digits and hand-written character trajectories.
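The core idea summarized above — recovering slowly varying features by solving a generalized eigenproblem over the covariance of a signal and the covariance of its time derivative — can be illustrated with a minimal linear SFA sketch. This is not the authors' implementation; it is a toy illustration assuming a discrete-time derivative via `np.diff` and SciPy's generalized symmetric eigensolver, with a hypothetical two-channel mixture of a slow and a fast sinusoid as input.

```python
import numpy as np
from scipy.linalg import eigh

def slow_feature_analysis(X, n_features=2):
    """Linear SFA via the generalized eigenproblem A w = lambda B w.

    X : (T, d) array, rows are time steps.
    Returns the n_features slowest linear projections of X.
    """
    # Centre the signal and form its discrete-time derivative.
    Xc = X - X.mean(axis=0)
    dX = np.diff(Xc, axis=0)

    # A: covariance of the derivative; B: covariance of the signal.
    A = dX.T @ dX / (dX.shape[0] - 1)
    B = Xc.T @ Xc / (Xc.shape[0] - 1)

    # Smallest generalized eigenvalues correspond to the slowest features.
    eigvals, W = eigh(A, B)
    return Xc @ W[:, :n_features]

# Toy example: a slow sine linearly mixed with a fast one.
t = np.linspace(0, 4 * np.pi, 1000)
slow = np.sin(t)
fast = np.sin(20 * t)
X = np.column_stack([slow + 0.5 * fast, fast - 0.3 * slow])
Y = slow_feature_analysis(X, n_features=1)

# The slowest extracted feature should correlate strongly with the slow source.
r = np.corrcoef(Y[:, 0], slow)[0, 1]
```

In the paper's pipeline, the raw input would first be expanded by an ESN reservoir and the same eigensolver applied to the reservoir states; the higher-order-derivative variant replaces `np.diff(Xc, axis=0)` with a higher-order difference.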
References
Abdullah A, Hussain A. A new biclustering technique based on crossing minimization. Neurocomputing. 2006;69(16):1882–96.
Antonelo E, Schrauwen B. Learning slow features with reservoir computing for biologically-inspired robot localization. Neural Netw. 2012;25:178–90.
Bache K, Lichman M. UCI machine learning repository. Irvine, CA: University of California, School of Information and Computer Science; 2013. http://archive.ics.uci.edu/ml.
Berkes P. Pattern recognition with slow feature analysis. Cognitive Sciences EPrint Archive (CogPrints) 4104, 2005.
Blaschke T, Berkes P, Wiskott L. What is the relation between slow feature analysis and independent component analysis? Neural Comput. 2006;18(10):2495–508.
Bush K, Anderson C. Modeling reward functions for incomplete state representations via echo state network. In: Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN'05), Vol. 5. IEEE; 2005.
Cheema TA, Qureshi IM, Hussain A. Blind image deconvolution using space-variant neural network approach. Electron Lett. 2005;41(6):308–09.
Ding Y, Song Y, Fan S, Qu Z, Chen L. Specificity and generalization of visual perceptual learning in humans: an event-related potential study. Neuroreport. 2003;14(4):587–90.
Földiák P. Learning invariance from transformation sequences. Neural Comput. 1991;3(2):194–200.
Gou Z, Fyfe C. A canonical correlation neural network for multicollinearity and functional data. Neural Netw. 2004;17(2):285–93.
Gou Z, Fyfe C. A family of networks which perform canonical correlation analysis. Int J Knowl-Based Intell Eng Syst. 2001;5(2):76–82.
Green CS, Bavelier D. Exercising your brain: a review of human brain plasticity and training-induced learning. Psychol Aging. 2008;23(4):692.
Huang Y, Zhao J, Tian M, Zou Q, Luo S. Slow feature discriminant analysis and its application on handwritten digit recognition. In: Proceedings of the 2009 International Joint Conference on Neural Networks (IJCNN 2009). IEEE; 2009. p. 1294–7.
Jaeger H, Haas H. Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication. Science. 2004;304(5667):78–80.
Jaeger H. Short term memory in echo state networks. GMD-Forschungszentrum Informationstechnik. 2001.
Knowlton BJ, Mangels JA, Squire LR. A neostriatal habit learning system in humans. Science. 1996;273(5280):1399–402.
Kompella VR, Matthew L, Schmidhuber J. Incremental slow feature analysis: adaptive low-complexity slow feature updating from high-dimensional input streams. Neural Comput. 2012;24(11):2994–3024.
Legenstein R, Wilbert N, Wiskott L. Reinforcement learning on slow features of high-dimensional input streams. PLoS Comput Biol. 2010;6(8):e1000894.
Lukoševičius M, Jaeger H. Reservoir computing approaches to recurrent neural network training. Comput Sci Rev. 2009;3(3):127–49.
Mangels JA, Butterfield B, Lamb J, Good C, Dweck CS. Why do beliefs about intelligence influence learning success? A social cognitive neuroscience model. Soc Cogn Affect Neurosci. 2006;1(2):75–86.
Peng D, Yi Z, Luo W. Convergence analysis of a simple minor component analysis algorithm. Neural Netw. 2007;20(7):842–50.
Plöger PG, Arghir A, Gunther T, Hosseiny R. Echo state networks for mobile robot modeling and control. In: RoboCup 2003: Robot Soccer World Cup VII. Springer Berlin Heidelberg; 2004. p. 157–68.
Qu Z, Song Y, Ding Y. ERP evidence for distinct mechanisms of fast and slow visual perceptual learning. Neuropsychologia. 2010;48(6):1869–74.
Schraudolph NN, Sejnowski TJ. Competitive anti-Hebbian learning of invariants. In: Advances in Neural Information Processing Systems, Vol. 4; 1991.
Skowronski MD, Harris JG. Minimum mean squared error time series classification using an echo state network prediction model. In: Proceedings of the 2006 IEEE International Symposium on Circuits and Systems (ISCAS 2006). IEEE; 2006.
Stone JV. Learning perceptually salient visual parameters using spatiotemporal smoothness constraints. Neural Comput. 1996;8(7):1463–92.
Tong MH, Bickett AD, Christiansen EM, Cottrell GW. Learning grammatical structure with echo state networks. Neural Netw. 2007;20(3):424–32.
Turner R, Sahani M. A maximum-likelihood interpretation for slow feature analysis. Neural Comput. 2007;19(4):1022–38.
Wang TD, Fyfe C. Visualising temporal data using reservoir computing. J Inf Sci Eng. 2013;29(4):695–709.
Wang TD, Wu X, Fyfe C. Factors important for good visualisation of time series. Int J Comput Sci Eng. (in press).
Weng J, Zhang Y, Hwang W. Candid covariance-free incremental principal component analysis. IEEE Trans Pattern Anal Mach Intell. 2003;25(8):1034–40.
Werbos PJ. Intelligence in the brain: a theory of how it works and how to build it. Neural Netw. 2009;22(3):200–12.
Wiskott L, Sejnowski TJ. Slow feature analysis: unsupervised learning of invariances. Neural Comput. 2002;14(4):715–70.
Wiskott L. Estimating driving forces of nonstationary time series with slow feature analysis; 2003. arXiv preprint cond-mat/0312317.
Zhang Z, Zhao M, Chow TW. Binary- and multi-class group sparse canonical correlation analysis for feature extraction and classification. IEEE Trans Knowl Data Eng. 2013;25(10):2192–205.
Zhang Q, Leung YW. A class of learning algorithms for principal component analysis and minor component analysis. IEEE Trans Neural Netw. 2000;11(2):529–33.
Acknowledgments
The first author is grateful to Professor Colin Fyfe, formerly with the University of the West of Scotland, for his insightful suggestions, which helped improve the writing of this paper.
Appendix
See Table 2.
Cite this article
Malik, Z.K., Hussain, A. & Wu, J. Novel Biologically Inspired Approaches to Extracting Online Information from Temporal Data. Cogn Comput 6, 595–607 (2014). https://doi.org/10.1007/s12559-014-9257-0