
Nonparametric data selection for improvement of parametric neural learning: A cumulant-surrogate method

  • Oral Presentations: Theory VI: Time Series
  • Conference paper
Artificial Neural Networks — ICANN 96 (ICANN 1996)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1112)


Abstract

We introduce a nonparametric, cumulant-based statistical approach for detecting linear and nonlinear statistical dependences in non-stationary time series. Dependence is detected by a predictability measure that tests the null hypothesis of statistical independence, expressed in Fourier space, via the surrogate method: predictability is defined as a higher-order-cumulant-based significance that discriminates between the original data and a set of scrambled surrogate data realizing the null hypothesis of no causal relationship between past and present. In this formulation, nonlinear and non-Gaussian temporal dependences can be detected. The predictability measure can be used, for example, to select regions of the series that exhibit temporal structure as training data for a neural network predictor. Regions showing only noisy behavior are ignored, which avoids learning irrelevant noise that would otherwise spoil the generalization characteristics of the network.
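The surrogate test outlined in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it builds phase-randomized surrogates (which preserve the power spectrum, i.e. all linear correlations, while destroying nonlinear structure) and uses a third-order sample cumulant as the discriminating statistic. The function names, the particular cumulant, and the significance-in-sigmas criterion are illustrative assumptions.

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Surrogate series with the same power spectrum as x but random
    Fourier phases -- a realization of the null hypothesis that any
    temporal structure is linear and Gaussian."""
    n = len(x)
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(X))
    Xs = np.abs(X) * np.exp(1j * phases)
    Xs[0] = X[0]                  # keep the mean (DC component)
    if n % 2 == 0:
        Xs[-1] = X[-1]            # keep the Nyquist component real
    return np.fft.irfft(Xs, n)

def third_order_cumulant(x, lag):
    """Sample third-order cumulant E[x(t) x(t+lag) x(t+2*lag)] of the
    zero-mean series; it vanishes for linear Gaussian processes, so it
    is sensitive to nonlinear / non-Gaussian dependences."""
    x = x - x.mean()
    return float(np.mean(x[:-2 * lag] * x[lag:-lag] * x[2 * lag:]))

def surrogate_significance(x, lag=1, n_surrogates=99, seed=0):
    """Significance in surrogate standard deviations: how far the
    statistic of the original data lies from the surrogate ensemble."""
    rng = np.random.default_rng(seed)
    c_orig = third_order_cumulant(x, lag)
    c_surr = np.array([
        third_order_cumulant(phase_randomized_surrogate(x, rng), lag)
        for _ in range(n_surrogates)
    ])
    return abs(c_orig - c_surr.mean()) / c_surr.std()

# Example: a Henon-map segment (deterministic and nonlinear) scores far
# higher than white noise, so only the structured region would be kept
# as training data for a neural predictor.
rng = np.random.default_rng(1)
henon = np.empty(2000)
henon[0], y = 0.1, 0.1
for t in range(1, 2000):
    henon[t] = 1.0 - 1.4 * henon[t - 1] ** 2 + y
    y = 0.3 * henon[t - 1]
sig_structured = surrogate_significance(henon)
sig_noise = surrogate_significance(rng.standard_normal(2000))
```

A high significance for a window of data indicates predictable structure; a value comparable to that of the noise series indicates a region to discard before training.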




Editor information

Christoph von der Malsburg, Werner von Seelen, Jan C. Vorbrüggen, Bernhard Sendhoff


Copyright information

© 1996 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Deco, G., Schürmann, B. (1996). Nonparametric data selection for improvement of parametric neural learning: A cumulant-surrogate method. In: von der Malsburg, C., von Seelen, W., Vorbrüggen, J.C., Sendhoff, B. (eds) Artificial Neural Networks — ICANN 96. ICANN 1996. Lecture Notes in Computer Science, vol 1112. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-61510-5_24



  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-61510-1

  • Online ISBN: 978-3-540-68684-2

