Abstract
We investigate a number of Artificial Neural Network architectures, both well-known and more “exotic”, applied to long-term forecasting of financial time series of indexes on different global markets. The particular interest of this research is the correlation of these indexes’ behaviour from the perspective of cross-training Machine Learning algorithms: does training an algorithm on an index from one global market yield similar, or even better, accuracy when the resulting model is applied to predicting an index from a different market? The predominantly positive answer demonstrated here is another argument in favour of Eugene Fama’s long-debated Efficient Market Hypothesis.
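The cross-training setup described above can be illustrated with a minimal sketch: fit a model on returns from one index and evaluate it on another, comparing against a same-index baseline. The sketch below is purely illustrative and is not the paper's method or data; it uses synthetic random-walk series in place of real indexes and a closed-form AR(1) regression in place of the neural architectures studied in the paper.

```python
import random

def make_index(seed, n=500, drift=0.0003, vol=0.01):
    """Synthetic log-price series standing in for a market index."""
    rng = random.Random(seed)
    prices = [0.0]
    for _ in range(n - 1):
        prices.append(prices[-1] + drift + rng.gauss(0, vol))
    return prices

def returns(prices):
    """First differences of log prices, i.e. log returns."""
    return [b - a for a, b in zip(prices, prices[1:])]

def fit_ar1(r):
    """Closed-form least squares for r[t] ~ a * r[t-1] + b."""
    x, y = r[:-1], r[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    a = cov / var
    b = my - a * mx
    return a, b

def mse(r, a, b):
    """One-step-ahead mean squared prediction error on series r."""
    errs = [(a * x + b - y) ** 2 for x, y in zip(r[:-1], r[1:])]
    return sum(errs) / len(errs)

idx_a = returns(make_index(seed=1))   # "home" market index
idx_b = returns(make_index(seed=2))   # "foreign" market index

a_coef, b_coef = fit_ar1(idx_a)       # train on index A only
cross_err = mse(idx_b, a_coef, b_coef)        # cross-index error on B
own_err = mse(idx_b, *fit_ar1(idx_b))         # same-index baseline on B
print(cross_err, own_err)
```

If the two series share statistical structure, the cross-index error should be close to the same-index baseline; the paper's finding is the analogous effect for real indexes and neural models.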
References
Beheim, L., Zitouni, A., Belloir, F., de la Housse, C.D.M.: New RBF neural network classifier with optimized hidden neurons number. WSEAS Trans. Syst. (2), 467–472 (2004)
Broomhead, D.S., Lowe, D.: Radial basis functions, multi-variable functional interpolation and adaptive networks. Technical report, Royal Signals and Radar Establishment Malvern (United Kingdom) (1988)
Eun, C.S., Shim, S.: International transmission of stock market movements. J. Financ. Quant. Anal. 24(2), 241–256 (1989)
Fama, E.F.: Efficient capital markets: a review of theory and empirical work. J. Financ. 25(2), 383–417 (1970)
Girosi, F., Poggio, T.: Representation properties of networks: Kolmogorov’s theorem is irrelevant. Neural Comput. 1(4), 465–469 (1989)
Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning. Springer Series in Statistics. Springer, New York (2001)
He, Q.Q., Pang, P.C.I., Si, Y.W.: Multi-source transfer learning with ensemble for financial time series forecasting. In: 2020 IEEE/WIC/ACM International Joint Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT), pp. 227–233. IEEE (2020)
Ivakhnenko, A.G.: Polynomial theory of complex systems. IEEE Trans. Syst. Man Cybern. 4, 364–378 (1971)
Jakaite, L., Schetinin, V., Maple, C., et al.: Bayesian assessment of newborn brain maturity from two-channel sleep electroencephalograms. Comput. Math. Methods Med. 2012 (2012)
Jensen, M.C.: The performance of mutual funds in the period 1945–1964. J. Financ. 23(2), 389–416 (1968)
Kolmogorov, A.N.: On the representation of continuous functions of several variables by superpositions of continuous functions of a smaller number of variables. Am. Math. Soc. (1961)
Kurkin, S.A., Pitsik, E.N., Musatov, V.Y., Runnova, A.E., Hramov, A.E.: Artificial neural networks as a tool for recognition of movements by electroencephalograms. In: ICINCO (1), pp. 176–181 (2018)
Kůrková, V.: Kolmogorov’s Theorem is relevant. Neural Comput. 3(4), 617–622 (1991)
Nabipour, M., Nayyeri, P., Jabani, H., Mosavi, A., Salwana, E.: Deep learning for stock market prediction. Entropy 22(8), 840 (2020)
Nyah, N., Jakaite, L., Schetinin, V., Sant, P., Aggoun, A.: Evolving polynomial neural networks for detecting abnormal patterns. In: 2016 IEEE 8th International Conference on Intelligent Systems (IS), pp. 74–80. IEEE (2016)
Park, J., Sandberg, I.W.: Universal approximation using radial-basis-function networks. Neural Comput. 3(2), 246–257 (1991)
Pinkus, A.: Approximation theory of the MLP model in neural networks. Acta Numer. 8, 143–195 (1999)
Poterba, J.M., Summers, L.H.: Mean reversion in stock prices: evidence and implications. J. Financ. Econ. 22(1), 27–59 (1988)
He, K., Zhang, X., Ren, S., Sun, J.: Deep residual learning for image recognition. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 770–778 (2016)
Roll, R.: Orange juice and weather. Am. Econ. Rev. 74(5), 861–880 (1984)
Roll, R.: What every CFO should know about scientific progress in financial economics: what is known and what remains to be resolved. Financ. Manage. 23(2), 69–75 (1994)
Schetinin, V., Jakaite, L., Schult, J.: Informativeness of sleep cycle features in Bayesian assessment of newborn electroencephalographic maturation. In: 2011 24th International Symposium on Computer-Based Medical Systems (CBMS), pp. 1–6. IEEE (2011)
Selitskaya, N., et al.: Deep learning for biometric face recognition: experimental study on benchmark data sets. Deep Biomet. 71–97 (2020)
Selitskiy, S.: Kolmogorov’s gate non-linearity as a step toward much smaller artificial neural networks. In: Proceedings of the 24th International Conference on Enterprise Information Systems, vol. 1, pp. 492–499 (2022)
Selitskiy, S.: Elements of active continuous learning and uncertainty self-awareness: a narrow implementation for face and facial expression recognition. In: Goertzel, B., Iklé, M., Potapov, A., Ponomaryov, D. (eds.) AGI 2022. LNCS, vol. 13539, pp. 394–403. Springer, Cham (2023). https://doi.org/10.1007/978-3-031-19907-3_38
Selitskiy, S., Christou, N., Selitskaya, N.: Using statistical and artificial neural networks meta-learning approaches for uncertainty isolation in face recognition by the established convolutional models. In: Nicosia, G., et al. (eds.) LOD 2021. LNCS, vol. 13164, pp. 338–352. Springer, Cham (2022). https://doi.org/10.1007/978-3-030-95470-3_26
Selitskiy, S.: Hybrid convolutional-multilayer perceptron artificial neural network for person recognition by high gamma EEG features. Medicinskiy Vest. Severnogo Kavkaza 17(2), 192–196 (2022)
Sewell, M.: History of the efficient market hypothesis. Research Note RN/11/04, University College London (2011)
Shleifer, A.: Inefficient Markets: An Introduction to Behavioural Finance. OUP, Oxford (2000)
Stoean, C., Paja, W., Stoean, R., Sandita, A.: Deep architectures for long-term stock price prediction with a heuristic-based strategy for trading simulations. PLoS One 14(10), e0223593 (2019)
Szegedy, C., et al.: Going deeper with convolutions. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 1–9 (2015)
Tsang, G., Deng, J., Xie, X.: Recurrent neural networks for financial time-series modelling. In: 2018 24th International Conference on Pattern Recognition (ICPR), pp. 892–897. IEEE (2018)
Venugopal, V., Baets, W.: Neural networks and statistical techniques in marketing research: a conceptual comparison. Mark. Intell. Plann. (1994)
Wickstrøm, K., Kampffmeyer, M., Mikalsen, K.Ø., Jenssen, R.: Mixing up contrastive learning: self-supervised representation learning for time series. Pattern Recogn. Lett. 155, 54–61 (2022)
Yen, G., Lee, C.F.: Efficient market hypothesis (EMH): past, present and future. Rev. Pac. Basin Financ. Mark. Policies 11(02), 305–329 (2008)
Zhang, G.P.: A neural network ensemble method with jittered training data for time series forecasting. Inf. Sci. 177(23), 5329–5346 (2007)
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Selitskiy, S. (2024). “It Looks All the Same to Me”: Cross-Index Training for Long-Term Financial Series Prediction. In: Nicosia, G., Ojha, V., La Malfa, E., La Malfa, G., Pardalos, P.M., Umeton, R. (eds) Machine Learning, Optimization, and Data Science. LOD 2023. Lecture Notes in Computer Science, vol 14505. Springer, Cham. https://doi.org/10.1007/978-3-031-53969-5_26
DOI: https://doi.org/10.1007/978-3-031-53969-5_26
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-53968-8
Online ISBN: 978-3-031-53969-5
eBook Packages: Computer Science; Computer Science (R0)