Abstract
Data mining refers to the use of new methods for the intelligent analysis of large data sets. This paper applies a nonlinear state space modeling (NSSM) technique, nonlinear dynamical factor analysis (NDFA), to mine the latent factors that are the original sources generating the observations of a causal time series. The motivation for mining these indirect sources rather than the observed time series itself is that the latent sources can yield much better results; in economics, for example, data are driven by explanatory variables such as inflation, unobserved trends, and fluctuations. The effectiveness of NDFA is evaluated on a simulated time series data set. Our empirical study indicates that NDFA outperforms independent component analysis in exploring the latent sources of the Taiwan unemployment rate time series.
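The NSSM setting described above assumes observations x(t) are produced from latent sources s(t) through nonlinear state and observation equations, s(t) = g(s(t-1)) + m(t) and x(t) = f(s(t)) + n(t). The following is a minimal simulation sketch of that generative process; the fixed tanh mixings standing in for f and g are illustrative assumptions only, since NDFA itself learns these maps as multilayer perceptrons by variational Bayesian methods.

```python
import numpy as np

rng = np.random.default_rng(0)

T, n_sources, n_obs = 500, 2, 5

# Fixed random weights for toy nonlinear maps g (dynamics) and f
# (observation); chosen only to simulate data of the assumed form.
A = rng.normal(scale=0.8, size=(n_sources, n_sources))  # state dynamics
B = rng.normal(size=(n_obs, n_sources))                 # observation mixing

s = np.zeros((T, n_sources))  # latent sources s(t)
x = np.zeros((T, n_obs))      # observed time series x(t)

for t in range(1, T):
    # State equation: s(t) = g(s(t-1)) + process noise m(t)
    s[t] = np.tanh(A @ s[t - 1]) + 0.05 * rng.normal(size=n_sources)
    # Observation equation: x(t) = f(s(t)) + observation noise n(t)
    x[t] = np.tanh(B @ s[t]) + 0.05 * rng.normal(size=n_obs)

print(s.shape, x.shape)
```

A latent-source method such as NDFA would be given only `x` and asked to recover signals equivalent to `s`, which is why recovering the indirect sources, rather than modeling `x` directly, is the goal.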
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Chen, W.S., Yu, F.J. (2011). Mining Latent Sources of Causal Time Series Using Nonlinear State Space Modeling. In: Nguyen, N.T., Kim, C.G., Janiak, A. (eds.) Intelligent Information and Database Systems. ACIIDS 2011. Lecture Notes in Computer Science, vol. 6591. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-20039-7_14
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-20038-0
Online ISBN: 978-3-642-20039-7