Research Article
DOI: 10.1145/3604078.3604095

A Method of Rainfall-Runoff Prediction Based on Transformer

Published: 26 October 2023

Abstract

Accurately predicting rainfall runoff is highly beneficial for flood forecasting, water resource management, and planning. Because the underlying data form a nonlinear time series, the task remains challenging in hydrological information processing, and existing methods often fail to forecast the rainy and dry seasons well in certain regions. To address this issue, this paper proposes a rainfall-runoff prediction model called P-former, built on two kinds of positional encodings: Feature Positional Encoding (FPE) and Temporal Positional Encoding (TPE). FPE and TPE capture the correlations among meteorological features and the periodicity of the time information, improving the model's ability to anticipate future rainy and dry seasons and to sense the range of runoff variation, and ultimately its runoff prediction accuracy. Experimental results show that, for single-day average runoff forecasts in the rainy season, P-former reduces the Root Mean Squared Error (RMSE) by 48%-53% and the Mean Absolute Error (MAE) by 11%-19%.
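
To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of how a Transformer encoder for runoff forecasting could combine the two kinds of encodings the abstract describes: a temporal encoding derived from the annual cycle of the date stamps, and a learned feature-wise encoding over the meteorological inputs. All module names, dimensions, and the exact forms of FPE and TPE below are assumptions for illustration; the actual P-former architecture is not specified in the abstract.

    # Illustrative sketch only: the paper's P-former layers, dimensions, and
    # training setup are not given in the abstract, so everything here is assumed.
    import math
    import torch
    import torch.nn as nn


    class TemporalPositionalEncoding(nn.Module):
        """Encode day-of-year periodicity so rainy/dry seasons map to nearby phases."""

        def __init__(self, d_model: int, period: float = 365.25):
            super().__init__()
            # d_model is assumed even so sin/cos halves concatenate cleanly.
            self.d_model = d_model
            self.period = period

        def forward(self, day_of_year: torch.Tensor) -> torch.Tensor:
            # day_of_year: (batch, seq_len) integer days; output: (batch, seq_len, d_model)
            phase = 2 * math.pi * day_of_year.float() / self.period
            k = torch.arange(1, self.d_model // 2 + 1, device=day_of_year.device)
            angles = phase.unsqueeze(-1) * k  # harmonics of the annual cycle
            return torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)


    class PFormerSketch(nn.Module):
        """Minimal encoder: project inputs, add a learned feature encoding (FPE-like)
        and the temporal encoding (TPE-like), then predict single-day runoff."""

        def __init__(self, n_features: int, d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
            super().__init__()
            self.input_proj = nn.Linear(n_features, d_model)
            # FPE here is a learned embedding per meteorological feature, weighted by the
            # feature values and summed (an assumption; the paper may define FPE differently).
            self.fpe = nn.Parameter(torch.zeros(n_features, d_model))
            self.tpe = TemporalPositionalEncoding(d_model)
            layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, n_layers)
            self.head = nn.Linear(d_model, 1)  # single-day runoff estimate

        def forward(self, x: torch.Tensor, day_of_year: torch.Tensor) -> torch.Tensor:
            # x: (batch, seq_len, n_features); day_of_year: (batch, seq_len)
            fpe_term = (x.unsqueeze(-1) * self.fpe).sum(dim=2)  # per-feature embeddings scaled by values
            h = self.input_proj(x) + fpe_term + self.tpe(day_of_year)
            h = self.encoder(h)
            return self.head(h[:, -1])  # predict runoff from the last time step


    # Toy usage: 8 meteorological features over a 30-day window.
    model = PFormerSketch(n_features=8)
    x = torch.randn(2, 30, 8)
    doy = torch.randint(1, 366, (2, 30))
    print(model(x, doy).shape)  # torch.Size([2, 1])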

    Published In

    ICDIP '23: Proceedings of the 15th International Conference on Digital Image Processing
    May 2023
    711 pages
    ISBN:9798400708237
    DOI:10.1145/3604078

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. Attention
    2. LSTM
    3. Rainfall
    4. Runoff
    5. Transformer

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    ICDIP 2023
