Conventional neural network training methods find a single set of values for the network weights by minimizing an error function using a gradient descent-based technique. In contrast, the Bayesian approach infers the posterior distribution over weights, and makes predictions by averaging the predictions of a sample of networks, weighted by the posterior probability of each network given the data. This integrative nature allows the Bayesian approach to avoid many of the difficulties inherent in conventional methods. This paper reports on the application of Bayesian MLP techniques to predicting the direction of movement of the daily close value of the Australian All Ordinaries financial index. Predictions made over a 13-year out-of-sample period were tested against the null hypothesis that the mean accuracy of the model is no greater than that of a coin-flip procedure biased to take non-stationarity in the data into account. Results show that the null hypothesis can be rejected at the 0.005 level, and that the t-test p-values obtained using the Bayesian approach are smaller than those obtained using conventional MLP methods.
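The evaluation described above can be illustrated with a minimal sketch. This is not the paper's code: the labels and predictions below are synthetic, and the names (`labels`, `preds`, `p_up`) are illustrative assumptions. The key idea is that a coin biased to match the observed proportion of "up" days has expected accuracy p² + (1 − p)², which is the baseline a directional forecaster must beat, rather than the naive 0.5.

```python
import random

# Illustrative sketch (not the paper's code): compare a model's
# directional accuracy against a biased coin-flip baseline.
random.seed(0)

# Hypothetical out-of-sample direction labels (1 = up, 0 = down),
# with an upward drift, as stock indices typically exhibit.
labels = [1 if random.random() < 0.55 else 0 for _ in range(1000)]

# Hypothetical model predictions: correct on ~56% of days.
preds = [y if random.random() < 0.56 else 1 - y for y in labels]

# Directional accuracy of the model.
model_acc = sum(int(p == y) for p, y in zip(preds, labels)) / len(labels)

# A coin biased to predict "up" with the observed up-frequency p_up
# agrees with the actual direction with probability p^2 + (1-p)^2.
p_up = sum(labels) / len(labels)
coin_acc = p_up ** 2 + (1 - p_up) ** 2

print(f"model accuracy:       {model_acc:.3f}")
print(f"biased-coin accuracy: {coin_acc:.3f}")
```

In the paper's setting, accuracies computed over successive out-of-sample windows would then feed a one-sided t-test of the null hypothesis that the model's mean accuracy does not exceed the coin's.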
© 2009 Springer Science+Business Media B.V.
Skabar, A.A. (2009). Direction-of-Change Financial Time Series Forecasting Using Neural Networks: A Bayesian Approach. In: Ao, SI., Gelman, L. (eds) Advances in Electrical Engineering and Computational Science. Lecture Notes in Electrical Engineering, vol 39. Springer, Dordrecht. https://doi.org/10.1007/978-90-481-2311-7_44
Print ISBN: 978-90-481-2310-0
Online ISBN: 978-90-481-2311-7