Neurocomputing

Volume 14, Issue 2, 5 February 1997, Pages 123-138

Paper
Financial time series modelling with discounted least squares backpropagation

https://doi.org/10.1016/S0925-2312(96)00005-7

Abstract

We propose a simple modification to the error backpropagation procedure which takes into account gradually changing input-output relations. The procedure is based on the principle of discounted least squares (DLS), whereby learning is biased towards more recent observations, with long-term effects experiencing exponential decay through time. This is particularly important in systems in which the structural relationship between input and response vectors changes gradually over time but certain elements of long-term memory are still retained. The procedure is implemented by a simple modification of the least-squares cost function commonly used in error backpropagation. We compare the performance of the two cost functions using both a controlled simulation experiment and a non-trivial application in estimating stock returns on the basis of multiple factor exposures. We show that in both cases the DLS procedure gives significantly better results. Typically, there is an average improvement of over 30% (in MSE terms) for the stock return modelling problem.
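The DLS cost described above can be sketched in a few lines. The sketch below is illustrative only and not the paper's exact formulation: the discount factor lam and the exponential weighting w_t = lam**(T - t) are assumptions chosen to show the principle, and only the gradient with respect to the network outputs is given (the rest of backpropagation is unchanged).

import numpy as np

# Sketch of a discounted least squares (DLS) cost for backpropagation.
# Assumption: weights w_t = lam**(T - t), with 0 < lam <= 1, so the newest
# observation has weight 1 and older observations decay exponentially.

def dls_loss_and_grad(y_pred, y_true, lam=0.99):
    """DLS-weighted squared error and its gradient w.r.t. the predictions.

    Observations are ordered in time: index 0 is the oldest, index T-1 the
    most recent. Older errors are down-weighted, so learning is biased
    towards recent observations.
    """
    y_pred = np.asarray(y_pred, dtype=float)
    y_true = np.asarray(y_true, dtype=float)
    T = len(y_true)
    # Exponential decay through time: weight 1 for the newest point,
    # lam**(T-1) for the oldest.
    weights = lam ** np.arange(T - 1, -1, -1)
    errors = y_pred - y_true
    loss = 0.5 * np.sum(weights * errors ** 2)
    # Gradient to feed into backpropagation in place of the ordinary
    # least-squares gradient.
    grad = weights * errors
    return loss, grad

# Toy usage: an error on a recent observation contributes more to the cost
# (and to the gradient) than an equally large error on an old observation.
if __name__ == "__main__":
    y_true = np.array([0.0, 0.0, 0.0, 0.0])
    y_pred = np.array([1.0, 0.0, 0.0, 1.0])  # one old error, one recent error
    loss, grad = dls_loss_and_grad(y_pred, y_true, lam=0.9)
    print(loss, grad)

Setting lam = 1 recovers the ordinary least-squares cost, so the modification reduces to standard backpropagation when the input-output relation is stationary.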


Cited by (36)

  • Three-level network analysis of the North American natural gas price: A multiscale perspective

    2020, International Review of Financial Analysis
    Citation Excerpt:

    Therefore, it is suitable to study the North American natural gas market from a multiscale perspective. There are many methods for studying financial time series, such as discounted least squares backpropagation (Refenes, Bentz, Bunn, et al., 1997), mixed multiple linear regression (Kalashnikov, Matis, & Pérez-Valdés, 2010), Hilbert-Huang transform (Huang, Wu, Qu, et al., 2003), Diag-BEKK model (Boldanov, Degiannakis, & Filis, 2016) and rough set model (Yao & Herbert, 2009). But these are all focused on the time series itself.

  • Using Volume Weighted Support Vector Machines with walk forward testing and feature selection for the purpose of creating stock trading strategy

    2015, Expert Systems with Applications
    Citation Excerpt:

    In many further studies, an attempt was made to increase models’ performance. Refenes, Bentz, Bunn, Burgess, and Zapranis (1997) proposed a modification of the least-squares cost function used in error backpropagation. The obtained estimator was biased towards more recent observations, and the bias was determined by the decay parameter in the sigmoid function.

  • High-performance Concrete Compressive Strength Prediction using Time-Weighted Evolutionary Fuzzy Support Vector Machines Inference Model

    2012, Automation in Construction
    Citation Excerpt:

    Thus, estimation requires development of a more advanced time series prediction algorithm, such as that achieved using an AI approach. Refenes et al. [36] described structural change as a time series data characteristic that should always be taken into account in all methodological approaches to time-series analysis. In light of this characteristic, Cao et al. [8] expressed that recent data provide more relevant information than distant data.

  • Flexible neural trees ensemble for stock index modeling

    2007, Neurocomputing
    Citation Excerpt:

    By highlighting the advantages and overcoming the limitations of both the neural networks technique and rule-based systems technique, the hybrid approach can facilitate the development of more reliable intelligent systems to model expert thinking and to support the decision-making processes. Refenes et al. [24] proposed a simple modification to the error backpropagation procedure which takes into account gradually changing input–output relations. The procedure is based on the principle of discounted least squares whereby learning is biased towards more recent observations with long term effects experiencing exponential decay through time.


This research is supported by the Department of Trade and Industry under the NCTT programme and the corporate members of the NeuroForecasting Club. We would like to thank Barclays-BZW, Citibank International, Mars Group, Postel Investment Management, Sabre Fund Management, and Societe Generale for their material and technical support.
