A self-organising mixture autoregressive network for FX time series modelling and prediction
Introduction
Foreign exchange (FX or forex) markets have grown rapidly in recent years and have become by far the largest financial market in the world: average daily trade in the global forex and related markets was reported to be over US$4 trillion in April 2007. FX rate forecasting has been an active and challenging area of research and has attracted a great deal of attention ever since the collapse of the Bretton Woods system in 1973. Trend analysis of spot FX rates has been a recurrent theme among statisticians, econometricians and computer scientists. A fundamental approach is to use economic theories to underline the structural relations between exchange rates and other variables (e.g. interest rates and the consumer price index), and to use statistical methods to identify the correlation between past data and future moves. Researchers have devoted a great deal of effort to developing various techniques in order to build a valid model. However, most econometric and time series techniques fail to outdo the simplest random walk, especially over short horizons [21], [12]. The reason is that most econometric models are linear and rest on specific, strict assumptions. For instance, autoregressive (AR) and autoregressive moving average (ARMA) models assume a linear relationship between the current value of a variable and its previous values and error terms, and the mean and variance of the variables are regarded as constant over time; in other words, the process is assumed to be stationary. In practice, these conditions are rarely met.
In this paper, adaptive neural networks, in particular the self-organising map (SOM), are explored for modelling FX time series in conjunction with regressive models. The approach uses the SOM to divide a time series into a number of homogeneous processes, which are then modelled by the nodes of an enhanced temporal SOM. The proposed network is termed the self-organising mixture autoregressive (SOMAR) model; it uses AR models as components in the construction of the topological mixture model. A brief review of AR and related regressive models is given first.
The statistical AR model has been a primary tool in modelling time series, including financial time series [18]. The ARMA model is an extended version of the AR model. Econometricians also use the generalised autoregressive conditional heteroskedastic (GARCH) model [1] to further model the volatilities or variances of a financial time series.
The notation AR(p) refers to an autoregressive model of order p. The model is written as

x_t = c + a_1 x_{t-1} + a_2 x_{t-2} + … + a_p x_{t-p} + ε_t = c + aᵀx_{t-1} + ε_t,

where a = [a_1, …, a_p]ᵀ are the parameters of the model, x_{t-1} = [x_{t-1}, …, x_{t-p}]ᵀ is a concatenated input vector, c is a constant and ε_t is white noise with zero mean and variance σ². The process is either a random walk, when the characteristic equation exhibits a unit root, or covariance-stationary if the roots of the characteristic equation z^p − a_1 z^{p−1} − … − a_p = 0 all lie inside the unit circle.
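As an aside, an AR(p) process of this form is straightforward to simulate directly from the recursion. The sketch below (parameter values are illustrative, not taken from the paper) generates an AR(2) series in Python:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar(coeffs, c=0.0, sigma=1.0, n=500, rng=rng):
    """Simulate x_t = c + sum_i a_i x_{t-i} + eps_t with Gaussian white noise."""
    coeffs = np.asarray(coeffs)
    p = len(coeffs)
    x = np.zeros(n + p)                       # p zero values as start-up history
    eps = rng.normal(0.0, sigma, size=n + p)
    for t in range(p, n + p):
        x[t] = c + coeffs @ x[t - p:t][::-1] + eps[t]
    return x[p:]

# A covariance-stationary AR(2): the roots of z^2 - 0.5z + 0.3 lie inside the unit circle
series = simulate_ar([0.5, -0.3], n=1000)
```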
An ARMA model with the notation ARMA(p, q) is an AR(p) model with q moving average (MA) terms. The model can be written as

x_t = c + Σ_{i=1}^{p} a_i x_{t−i} + ε_t + Σ_{j=1}^{q} b_j ε_{t−j},

where b_1, …, b_q are the parameters of the moving average. The error terms ε_t are assumed to be independent identically distributed (i.i.d.) random variables sampled from a normal distribution with zero mean and variance σ². If this condition does not hold, the GARCH model in Section 1.3 provides a generalised alternative.
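To illustrate the effect of the extra MA terms, a minimal ARMA(1,1) simulation (parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
a1, b1 = 0.7, 0.3                 # AR and MA parameters (illustrative values)
eps = rng.normal(size=n)          # i.i.d. N(0, 1) error terms
x = np.zeros(n)
for t in range(1, n):
    # current value depends on the previous value AND the previous error term
    x[t] = a1 * x[t - 1] + eps[t] + b1 * eps[t - 1]
```

The MA term feeds past shocks directly into the current value; for these parameters the lag-1 autocorrelation is about 0.8, higher than the AR coefficient alone would produce.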
The parameters of an AR(p) model can be estimated by the recursive least-mean-square (LMS) method, which is a stochastic gradient descent optimisation:

a(t+1) = a(t) + η [x_t − a(t)ᵀ x_{t−1}] x_{t−1},

where η is the step size or learning rate, often a small positive constant or monotonically decreasing in value.
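A sketch of this LMS estimation for an AR(2) process (the generating parameters below are illustrative, not those used in the paper's figure):

```python
import numpy as np

rng = np.random.default_rng(1)

# Generate an AR(2) series with known parameters
true_a = np.array([0.6, -0.4])
n = 5000
x = np.zeros(n)
eps = rng.normal(0.0, 1.0, size=n)
for t in range(2, n):
    x[t] = true_a @ x[t - 2:t][::-1] + eps[t]

# Recursive LMS: a <- a + eta * (prediction error) * input
a_hat = np.zeros(2)
eta = 0.01                          # step size (small positive constant)
for t in range(2, n):
    xt = x[t - 2:t][::-1]           # input vector [x_{t-1}, x_{t-2}]
    err = x[t] - a_hat @ xt         # one-step prediction error
    a_hat += eta * err * xt         # stochastic gradient update
```

With a small constant step size the estimates converge to a noisy neighbourhood of the generating parameters; a decreasing step size trades adaptation speed for lower steady-state error.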
Fig. 1 depicts the parameter estimation process for an AR(2) process using the LMS method. The parameters have been correctly estimated, as indicated by the two solid curves (estimated parameters) in the upper plot; the dashed lines indicate the generating parameters, and the horizontal axis shows the time steps (iterations). The middle plot shows the overall mean-squared error (MSE) over the course of estimation, and the lower plot shows the squared error for each input.
The GARCH model [1] explicitly models the variance of the current residual as a linear function of the variances of the previous residuals. It has been widely applied to modelling financial time series, including FX rates, which exhibit different volatilities over time. It captures periods of high oscillation followed by periods of relative calm in a time series, or vice versa.
A simple GARCH(p, q) model can be expressed as follows:

ε_t = σ_t z_t,
σ_t² = ω + Σ_{i=1}^{q} α_i ε_{t−i}² + Σ_{j=1}^{p} β_j σ_{t−j}²,

where ε_t is the error term with conditional variance σ_t², and z_t is i.i.d. with zero mean and unit variance. ω, α_i and β_j are model parameters to be estimated, and p, q are pre-determined orders.
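A minimal GARCH(1,1) simulation sketch; the ω, α, β values below are illustrative and satisfy α + β < 1, so the unconditional variance ω/(1 − α − β) is finite:

```python
import numpy as np

def simulate_garch11(omega, alpha, beta, n=2000, seed=2):
    """Simulate eps_t = sigma_t * z_t with sigma_t^2 = omega + alpha*eps_{t-1}^2 + beta*sigma_{t-1}^2."""
    rng = np.random.default_rng(seed)
    z = rng.normal(size=n)                    # i.i.d., zero mean, unit variance
    eps = np.zeros(n)
    sig2 = np.zeros(n)
    sig2[0] = omega / (1.0 - alpha - beta)    # start at the unconditional variance
    eps[0] = np.sqrt(sig2[0]) * z[0]
    for t in range(1, n):
        sig2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sig2[t - 1]
        eps[t] = np.sqrt(sig2[t]) * z[t]
    return eps, sig2

eps, sig2 = simulate_garch11(omega=0.1, alpha=0.1, beta=0.8)
```

Plotting eps shows the characteristic volatility clustering: a large shock raises σ_t² and so tends to be followed by further large shocks.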
In the literature, the simple GARCH model has been extended by applying additional restrictions, such as the exponential GARCH of Nelson [23] and the quadratic GARCH of Sentana [27], or by changing the assumed innovation distribution from normal to a t-distribution, Cauchy distribution, etc. In this paper, only the commonly used standard GARCH model is used in the experiments.
The rest of this paper is organised as follows. Neural network approaches and SOM related time series modelling methods are discussed in Section 2. In Section 3, the proposed methodology is described. Section 4 presents applications of the proposed network for modelling and prediction of time series and FX rates. Finally, conclusions are given in Section 5.
Section snippets
Neural networks for FX modelling
The main difficulty in modelling financial time series is their nonstationarity: the mean and variance of the series are not constant but change over time. This implies that the variables switch their dynamics from time to time, or have different modes in different periods. This is particularly true of FX rates, owing to the inconstant flow of information. Previous studies [8] show that the distribution of daily returns is approximately
Methodology
The problem of predicting future values of a stochastic process is closely related to the task of estimating the unknown parameters of a regressive model. The target process can be assumed to be generated by a number of stationary autoregressive processes. It has applications in many fields, especially in econometrics and automatic control. A number of studies recently focus on modelling such nonstationary processes using mixture models [33]. When a nonstationary process is considered as a
Experiments
In this section, experiments on both Mackey-Glass data and real-world FX rates are presented. The proposed method is used to characterise the dynamics of a nonlinear, nonstationary time series and to estimate underlying local regressive models.
Conclusions
A new approach to modelling the nonstationarity of financial time series has been proposed, using a self-organising mixture autoregressive network. The network consists of a number of local autoregressive models that are organised and learnt in a self-organised fashion. An autocorrelation-based similarity is proposed and adopted as the fitness measure of local models to an input segment. It makes the network learning more effective and robust compared to the error-based and Euclidean-distance-based measures.
References (33)
Generalized autoregressive conditional heteroskedasticity, Journal of Econometrics (1986)
Regression neural network for error correction in foreign exchange forecasting and trading, Computers and Operations Research (2004)
The temporal Kohonen map, Neural Networks (1993)
Much ado about nothing? Exchange rate forecasting: neural networks vs. linear models using monthly and weekly data, Neurocomputing (1996)
Why is it so difficult to beat the random walk forecast of exchange rates?, Journal of International Economics (2003)
Empirical exchange rate models of the seventies: do they fit out of sample?, Journal of International Economics (1983)
Recursive self-organising maps, Neural Networks (2002)
Support vector machines experts for time series forecasting, Neurocomputing (2002)
A subordinate stochastic process model with finite variance for speculative price, Econometrica (1973)
S. Dablemont, G. Simon, A. Lendasse, A. Ruttiens, F. Blayo, M. Verleysen, Time series forecasting with SOM and local...
Empirical Modeling of Exchange Rate Dynamics
Neural Networks: A Comprehensive Foundation
Forecasting financial time series using neural network and fuzzy system-based techniques, Neural Computing and Applications
Self-Organizing Maps
Nonlinear Programming
He Ni completed his PhD programme in August 2008 at the School of Electrical and Electronic Engineering, the University of Manchester. His research interests include financial time series forecasting, neural networks and machine learning. He obtained a BEng degree in Electronic Engineering from Xidian University and an M.Sc. degree in Automatic Control and System Engineering from the University of Sheffield in 2001 and 2003, respectively. He started a lectureship at Zhejiang Gongshang University in September 2008.
Hujun Yin is a Senior Lecturer (Associate Professor) at The University of Manchester, School of Electrical and Electronic Engineering. He received BEng and M.Sc. degrees from Southeast University and PhD degree from University of York in 1983, 1986 and 1996, respectively. His research interests include neural networks, self-organising systems in particular, pattern recognition, bio-/neuro-informatics, and machine learning applications. He has studied, extended and applied the self-organising map (SOM) and related topics such as unsupervised learning, principal manifolds and data visualisation extensively in the past 10 years and proposed a number of extensions including Bayesian SOM and ViSOM, a principled data visualisation method. He has published over 100 peer-reviewed articles in a range of topics from density modelling, text mining and knowledge management, gene expression analysis and peptide sequencing, novelty detection, to financial time series modelling, and recently decoding neuronal responses.
He is a senior member of the IEEE and a member of the UK EPSRC College. He is an Associate Editor of the IEEE Transactions on Neural Networks and a member of the Editorial Board of the International Journal of Neural Systems. He has served on the programme committee for more than 20 international conferences. He has been the Organising Chair, Programme Committee Chair, Steering Committee Member or Chair, Session Chair and General Chair for a number of conferences, such as International Workshop on Self-Organising Maps (WSOM), International Conference on Intelligent Data Engineering and Automated Learning (IDEAL), International Symposium on Neural Networks (ISNN), and International Joint Conference on Neural Networks (IJCNN). He has guest-edited a number of special issues on several leading international journals. He has received research funding from the EPSRC, BBSRC and DTI. He has also been a regular assessor for the EPSRC, the BBSRC, the Royal Society, the Hong Kong Research Grant Council, Netherlands Organisation for Scientific Research, and Slovakia Research and Development Council.