Abstract
Forecasting financial market movements is a challenging and important research problem in Financial Technology (Fintech). In this work, we investigate the ability of large language models to predict the overnight movement of Chinese stock indices from market summaries drawn from news media. We fine-tune a range of pre-trained language models and compare their performance against OpenAI's Generative Pre-trained Transformer (GPT) models, specifically GPT-3.5 and GPT-4. The empirical results show that the fine-tuned pre-trained models, despite having far fewer parameters and simpler architectures, outperform GPT-3.5 and GPT-4 on both accuracy and F1 score. All fine-tuned models are publicly available on the Hugging Face platform (https://huggingface.co/hw2942).
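Because the fine-tuned classifiers are released on the Hugging Face Hub, they can be loaded for inference with the standard transformers API. The sketch below is a minimal, hypothetical example of how such a model would be applied to a market summary: the repository id under https://huggingface.co/hw2942, the sample summary text, and the label mapping (0 = down, 1 = up) are illustrative assumptions, not details confirmed by the paper.

```python
# Minimal inference sketch using the Hugging Face transformers library.
# NOTE: the repository id and the label mapping below are illustrative
# assumptions; see https://huggingface.co/hw2942 for the released models.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = "hw2942/example-overnight-movement-model"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

# A Chinese market summary gleaned from news media, as in the paper
# (example text is invented for illustration).
summary = "今日A股三大指数集体收涨，沪指涨逾1%，成交量明显放大。"

# Tokenize, truncating to the encoder's maximum input length.
inputs = tokenizer(summary, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits

# Assumed binary labels: 0 = overnight index moves down, 1 = moves up.
prediction = logits.argmax(dim=-1).item()
print("predicted overnight movement:", "up" if prediction == 1 else "down")
```

A comparable zero-shot prediction from GPT-3.5 or GPT-4, as in the paper's baseline comparison, would instead be obtained by prompting the OpenAI chat API with the same summary and parsing an up/down answer from the response.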
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Wang, H., Zhou, X. (2024). Forecasting Chinese Overnight Stock Index Movement Using Large Language Models with Market Summary. In: Tan, Y., Shi, Y. (eds) Data Mining and Big Data. DMBD 2023. Communications in Computer and Information Science, vol 2017. Springer, Singapore. https://doi.org/10.1007/978-981-97-0837-6_4
Print ISBN: 978-981-97-0836-9
Online ISBN: 978-981-97-0837-6