Elsevier

Neurocomputing

Volume 72, Issues 16–18, October 2009, Pages 3529-3537

A self-organising mixture autoregressive network for FX time series modelling and prediction

https://doi.org/10.1016/j.neucom.2009.03.019

Abstract

A great deal of effort has been devoted to gaining an advantage in foreign exchange (FX) rate prediction. However, most existing techniques seldom outperform the simple random walk model in practical applications. This paper describes a self-organising network formed on the basis of a mixture of adaptive autoregressive models. The proposed network, termed the self-organising mixture autoregressive (SOMAR) model, can describe and model nonstationary, nonlinear time series by means of a number of underlying local regressive models. An autocorrelation coefficient-based measure is proposed as the similarity measure for assigning input samples to the underlying local models. Experiments on both benchmark time series and several FX rates have been conducted. The results show that the proposed method consistently outperforms other local time series modelling techniques on a range of performance measures, including mean-squared error, correct trend prediction percentage, accumulated profit and model variance.

Introduction

Foreign exchange (FX or forex) markets have grown rapidly in recent years and have become by far the largest financial market in the world, with average daily turnover in the global forex and related markets reported to be over US$4 trillion in April 2007. FX rate forecasting has been an active and challenging area of research and has attracted a great deal of attention ever since the collapse of the Bretton Woods system in 1973. Trend analysis of spot FX rates has been a recurrent theme among statisticians, econometricians and computer scientists. A fundamental approach is to use economic theories to describe the structural relations between exchange rates and other variables (e.g. interest rates and the consumer price index) and to use statistical methods to identify the correlation between past data and future moves. Researchers have devoted a great deal of effort to developing various techniques in order to build a valid model. However, most econometric and time series techniques fail to outdo the simple random walk, especially over short horizons [21], [12]. The reason is that most econometric models are linear and are used under specific or strict assumptions. For instance, autoregressive (AR) and autoregressive moving average (ARMA) models assume a linear relationship between the current value of the variables and the previous values and error terms. The mean and variance of the variables are regarded as constant over time; in other words, the process is assumed to be stationary. In practice, these conditions cannot be met.

In this paper, adaptive neural networks, in particular the self-organising map (SOM), are explored for modelling FX time series in conjunction with regressive models. The approach uses the SOM to divide a time series into a number of homogeneous processes, which are then modelled by the nodes of an enhanced temporal SOM. The proposed network is termed the self-organising mixture autoregressive (SOMAR) model; it uses AR models as components in the construction of a topological mixture model. A brief review of AR and related regressive models is given first.

The statistical AR model has been a primary tool for modelling time series, including financial time series [18]. The ARMA model is an extended version of the AR model. Econometricians also use the generalised autoregressive conditional heteroskedasticity (GARCH) model [1] to further capture the volatilities or variances of a financial time series.

The notation AR(p) refers to an autoregressive model of order p. The AR(p) model is written as

x_t = c + \sum_{i=1}^{p} \varphi_i x_{t-i} + \varepsilon_t = c + x_{t-1}^{(p)T} w + \varepsilon_t,

where w = [\varphi_1, \ldots, \varphi_p]^T are the parameters of the model, x_{t-1}^{(p)} = [x_{t-1}, x_{t-2}, \ldots, x_{t-p}]^T is the concatenated input vector, c is a constant and \varepsilon_t is white noise with zero mean and variance \sigma^2. The process is a random walk when x_t exhibits a unit root, and covariance-stationary when the roots of the characteristic equation all lie inside the unit circle.
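As a concrete illustration, the parameters of an AR(p) model can be estimated from data by ordinary least squares. The sketch below (plain NumPy; the AR(2) coefficients and noise level are illustrative assumptions, not values from the paper) simulates a stationary AR(2) series and recovers c and the \varphi_i:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stationary AR(2) process: x_t = c + phi1*x_{t-1} + phi2*x_{t-2} + eps_t
c_true, phi_true = 0.1, np.array([0.6, -0.3])
n = 2000
x = np.zeros(n)
for t in range(2, n):
    x[t] = c_true + phi_true @ x[t-2:t][::-1] + rng.normal(scale=0.5)

# Ordinary least squares: regress x_t on [1, x_{t-1}, x_{t-2}]
p = 2
X = np.column_stack([np.ones(n - p)] + [x[p - i - 1 : n - i - 1] for i in range(p)])
y = x[p:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
# coef estimates [c, phi1, phi2]
```

With 2000 samples the estimates are typically within a few hundredths of the generating values.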

An ARMA model with the notation ARMA(p,q) is an AR(p) model with q moving average (MA) terms, MA(q). The ARMA(p,q) model can be written as

x_t = c + \varepsilon_t + \sum_{i=1}^{p} \varphi_i x_{t-i} + \sum_{i=1}^{q} \theta_i \varepsilon_{t-i},

where \{\theta_1, \ldots, \theta_q\} are the parameters of the moving average. The error terms \varepsilon_t are assumed to be independent and identically distributed (i.i.d.) random variables drawn from a normal distribution with zero mean and variance \sigma^2. If this condition does not hold, the GARCH model described below provides a generalised alternative.
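To make the ARMA(p,q) recursion concrete, the following sketch simulates an ARMA(1,1) process with illustrative parameters (not taken from the paper) and compares its sample mean with the theoretical unconditional mean:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate ARMA(1,1): x_t = c + phi*x_{t-1} + eps_t + theta*eps_{t-1}
c, phi, theta = 0.2, 0.7, 0.4
n = 5000
eps = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = c + phi * x[t-1] + eps[t] + theta * eps[t-1]

# For a stationary ARMA(1,1), the unconditional mean is c / (1 - phi)
mean_theory = c / (1 - phi)
```

The sample mean of `x` converges to `mean_theory` as the series length grows.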

The parameters of an AR(p) model can be estimated by the recursive least-mean-square (LMS) method, a stochastic gradient descent optimisation:

w(t) = w(t-1) + \eta(t) e(t) x(t), \quad e(t) = x(t+1) - x(t)^T w(t-1),

where \eta(t) is the step size or learning rate, often a small positive constant or a monotonically decreasing sequence.
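A minimal sketch of this recursive update, assuming a simulated AR(2) series and an arbitrary constant learning rate (both illustrative choices, not the paper's settings):

```python
import numpy as np

rng = np.random.default_rng(2)

# Generate an AR(2) series with known parameters w_true (illustrative values)
w_true = np.array([0.5, -0.4])
n = 20000
x = np.zeros(n)
for t in range(2, n):
    x[t] = w_true @ x[t-2:t][::-1] + rng.normal(scale=0.1)

# Recursive LMS: w(t) = w(t-1) + eta * e(t) * x(t), with prediction error
# e(t) = x(t+1) - x(t)^T w(t-1); eta is a small constant step size here.
eta = 0.1
w = np.zeros(2)
for t in range(2, n - 1):
    xv = x[t-1:t+1][::-1]     # input vector [x_t, x_{t-1}]
    e = x[t + 1] - xv @ w     # one-step prediction error
    w = w + eta * e * xv      # stochastic gradient update
# w converges towards w_true, as in Fig. 1
```

The trajectory of `w` over the iterations mirrors the solid curves in Fig. 1: rapid initial movement followed by small fluctuations around the generating parameters.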

Fig. 1 depicts the parameter estimation process for an AR(2) process using the LMS method. The parameters have been correctly estimated, as indicated by the two solid curves (estimated parameters) in the upper plot; the dashed lines indicate the true generating parameters. The horizontal axis marks the time steps or iterations. The middle plot shows the overall mean-squared error (MSE) over the course of the estimation, and the lower plot shows the squared error for each input.

The GARCH model [1] explicitly models the variance of the current residual as a linear function of the variances of the previous residuals. It has been widely applied to modelling financial time series, including FX rates, which exhibit different volatilities from time to time: periods of high volatility followed by periods of relative calm, or vice versa.

A simple GARCH(\theta,\rho) model can be expressed as follows:

x_t = \varphi_0 + \sum_{i=1}^{p} \varphi_i x_{t-i} + \varepsilon_t,

\sigma_t^2 = \alpha_0 + \sum_{i=1}^{\theta} \alpha_i \varepsilon_{t-i}^2 + \sum_{j=1}^{\rho} \beta_j \sigma_{t-j}^2,

where \varepsilon_t is the error term with \varepsilon_t = \sigma_t v_t, v_t is i.i.d. with zero mean and unit variance, \alpha_0, \alpha_i and \beta_j are model parameters to be estimated, and \theta, \rho are pre-determined orders.
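The following sketch simulates a GARCH(1,1) volatility process under the assumption \varepsilon_t = \sigma_t v_t with standard normal v_t; the parameter values are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate GARCH(1,1): sigma_t^2 = a0 + a1*eps_{t-1}^2 + b1*sigma_{t-1}^2,
# with eps_t = sigma_t * v_t and v_t ~ i.i.d. N(0, 1).
a0, a1, b1 = 0.05, 0.10, 0.85   # a1 + b1 < 1 gives covariance stationarity
n = 10000
sigma2 = np.empty(n)
eps = np.empty(n)
sigma2[0] = a0 / (1 - a1 - b1)  # start at the unconditional variance (= 1.0 here)
eps[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
for t in range(1, n):
    sigma2[t] = a0 + a1 * eps[t - 1] ** 2 + b1 * sigma2[t - 1]
    eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
```

The sample variance of `eps` stays close to the unconditional variance a0/(1 - a1 - b1), while `sigma2` exhibits the volatility clustering described above.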

In the literature, the simple GARCH model has been extended by imposing additional restrictions, such as the exponential GARCH of Nelson [23] and the quadratic GARCH of Sentana [27], or by changing the assumption that v_t follows a normal distribution to, e.g., a Student's t or Cauchy distribution. In this paper, only the commonly used standard GARCH model is included in the experiments.

The rest of this paper is organised as follows. Neural network approaches and SOM related time series modelling methods are discussed in Section 2. In Section 3, the proposed methodology is described. Section 4 presents applications of the proposed network for modelling and prediction of time series and FX rates. Finally, conclusions are given in Section 5.

Section snippets

Neural networks for FX modelling

The main difficulty in modelling financial time series is their nonstationarity: the mean and variance of the time series are not constant but change over time. This implies that the variables switch their dynamics from time to time, or exhibit different modes in different periods. This is particularly true of FX rates, owing to the uneven flow of market information. Previous studies [8] show that the distribution of daily returns is approximately symmetric but heavy-tailed.

Methodology

The problem of predicting future values of a stochastic process is closely related to the task of estimating the unknown parameters of a regressive model. The target process can be assumed to be generated by a number of stationary autoregressive processes. This setting has applications in many fields, especially econometrics and automatic control. A number of recent studies focus on modelling such nonstationary processes using mixture models [33]. When a nonstationary process is considered as a combination of several underlying stationary processes, the modelling task becomes one of identifying the local models and assigning each input segment to the model that fits it best.
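The autocorrelation-based fitness idea from the abstract can be sketched as follows. Note that this `sac` helper is a simplified illustration of the principle (residuals of a well-fitting local model should be close to white noise), not necessarily the paper's exact formulation:

```python
import numpy as np

def sac(residuals, max_lag=5):
    """Sum of absolute autocorrelation coefficients of a residual segment.

    If a local AR model fits the segment well, its residuals are nearly
    white noise, so autocorrelations at nonzero lags are near zero and a
    smaller value indicates a better-fitting local model.
    """
    r = residuals - residuals.mean()
    denom = np.sum(r * r)
    if denom == 0.0:
        return 0.0
    return sum(abs(np.sum(r[k:] * r[:-k])) / denom for k in range(1, max_lag + 1))

rng = np.random.default_rng(4)
white = rng.standard_normal(500)   # residuals of a well-fitting model
ar1 = np.zeros(500)
for t in range(1, 500):            # strongly autocorrelated residuals
    ar1[t] = 0.95 * ar1[t - 1] + 0.1 * rng.standard_normal()
# sac(white) is much smaller than sac(ar1)
```

Assigning an input segment to the local model whose residuals minimise such a measure is what distinguishes this approach from assignment by raw prediction error or Euclidean distance.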

Experiments

In this section, experiments on both the Mackey-Glass benchmark data and real-world FX rates are presented. The proposed method is used to characterise the dynamics of a nonlinear, nonstationary time series and to estimate the underlying local regressive models.
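For reference, the Mackey-Glass benchmark series is generated from a delay differential equation; a simple Euler discretisation (the step size and parameter values below are common choices, not necessarily those used in the paper) looks like:

```python
import numpy as np

def mackey_glass(n, tau=17, a=0.2, b=0.1, dt=1.0, x0=1.2):
    """Euler discretisation of the Mackey-Glass delay differential equation
    dx/dt = a*x(t - tau) / (1 + x(t - tau)**10) - b*x(t)."""
    delay = int(tau / dt)
    x = np.full(n + delay, x0)          # constant initial history
    for t in range(delay, n + delay - 1):
        xd = x[t - delay]               # delayed state x(t - tau)
        x[t + 1] = x[t] + dt * (a * xd / (1.0 + xd ** 10) - b * x[t])
    return x[delay:]

series = mackey_glass(1000)
```

With tau = 17 the series is chaotic yet bounded, which makes it a standard testbed for nonlinear time series prediction.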

Conclusions

A new approach to modelling nonstationarity of financial time series has been proposed by using a self-organising mixture autoregressive network. The network consists of a number of local autoregressive models that are organised and learnt in a self-organised fashion. An autocorrelation-based similarity is proposed and adopted as the fitness measure of local models to an input segment. It makes the network learning more effective and robust compared to the error-based and Euclidean distance-based measures.

He Ni completed his PhD programme in August 2008 at the School of Electrical and Electronic Engineering, the University of Manchester. His research interests include financial time series forecasting, neural networks and machine learning. He obtained a BEng degree in Electronic Engineering from Xidian University and an M.Sc. degree in Automatic Control and System Engineering from the University of Sheffield in 2001 and 2003, respectively. He started a lectureship post at Zhejiang Gongshang University in September 2008.

References (33)

  • F.X. Diebold

    Empirical Modeling of Exchange Rate Dynamics

    (1988)
  • S. Haykin

    Neural Networks—A Comprehensive Foundation

    (1998)
  • V. Kodogiannis et al.

    Forecasting financial time series using neural network and fuzzy system-based techniques

    Neural Computing and Applications

    (2002)
  • T. Kohonen

    Self-Organizing Maps

    (1995)
  • T. Koskela, M. Varsta, J. Heikkonen, K. Kaski, Time series prediction using recurrent SOM with local linear models,...
  • H.W. Kuhn et al.

    Nonlinear Programming



Hujun Yin is a Senior Lecturer (Associate Professor) at The University of Manchester, School of Electrical and Electronic Engineering. He received BEng and M.Sc. degrees from Southeast University and a PhD degree from the University of York in 1983, 1986 and 1996, respectively. His research interests include neural networks, self-organising systems in particular, pattern recognition, bio-/neuro-informatics, and machine learning applications. He has studied, extended and applied the self-organising map (SOM) and related topics such as unsupervised learning, principal manifolds and data visualisation extensively in the past 10 years, and has proposed a number of extensions including the Bayesian SOM and ViSOM, a principled data visualisation method. He has published over 100 peer-reviewed articles on a range of topics from density modelling, text mining and knowledge management, gene expression analysis and peptide sequencing, novelty detection, to financial time series modelling, and recently decoding neuronal responses.

He is a senior member of the IEEE and a member of the UK EPSRC College. He is an Associate Editor of the IEEE Transactions on Neural Networks and a member of the Editorial Board of the International Journal of Neural Systems. He has served on the programme committees of more than 20 international conferences, and has been the Organising Chair, Programme Committee Chair, Steering Committee Member or Chair, Session Chair and General Chair for a number of conferences, such as the International Workshop on Self-Organising Maps (WSOM), the International Conference on Intelligent Data Engineering and Automated Learning (IDEAL), the International Symposium on Neural Networks (ISNN), and the International Joint Conference on Neural Networks (IJCNN). He has guest-edited a number of special issues for several leading international journals. He has received research funding from the EPSRC, BBSRC and DTI, and has been a regular assessor for the EPSRC, the BBSRC, the Royal Society, the Hong Kong Research Grant Council, the Netherlands Organisation for Scientific Research, and the Slovakia Research and Development Council.
