Abstract
Analysing historical data from turbines can identify faults and even predict potential failures. However, most studies focus on a single unit, overlooking the historical operational patterns of turbines from other units, which serve as important references for human experts. We propose ExpertAP, an approach that first learns the running patterns of turbines from the historical data of different units and then performs anomaly prediction for the target turbine. This approach faces two key challenges: the scarcity of high-quality anomaly labels and the difficulty of exploiting historical anomaly labels. To address the first challenge, we introduce a semi-supervised backbone that is pre-trained on a sequence-reconstruction task using data from multiple units and fine-tuned on the anomaly-prediction task using data from the target unit. We also propose a novel two-dimensional selection strategy: filtering out anomaly labels along the variable dimension during pre-training and filtering out redundant normal time-series sequences along the time dimension during fine-tuning. To address the second challenge, we model the anomaly labels as binary time series, which differ significantly from the continuously sampled sensor data. We therefore design separate embedding layers for the anomaly labels and the sensor data; these layers are trained during fine-tuning to align the two types of input in the latent space, allowing historical anomaly labels to serve as a basis for anomaly prediction. The proposed framework was evaluated on real data collected from a Turbine Supervision Instrumentation (TSI) system and achieved promising results.
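To make the dual-embedding idea concrete, the following is a minimal sketch, not the authors' implementation, of separate embedding layers that map continuously sampled sensor windows and binary anomaly-label series into a shared latent space, as described above. All module names, tensor shapes, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch of the dual-embedding idea (illustrative only, not the paper's code):
# continuous sensor windows and binary anomaly-label series are embedded by
# separate layers into a shared latent space before a common backbone.
import torch
import torch.nn as nn


class DualInputEmbedding(nn.Module):
    def __init__(self, n_sensors: int, d_model: int = 128):
        super().__init__()
        # Continuous sensor readings: a linear projection per time step.
        self.sensor_proj = nn.Linear(n_sensors, d_model)
        # Binary anomaly labels (0 = normal, 1 = anomalous): a lookup table.
        self.label_embed = nn.Embedding(2, d_model)

    def forward(self, sensors: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # sensors: (batch, time, n_sensors); labels: (batch, time) with values in {0, 1}
        z_sensor = self.sensor_proj(sensors)        # (batch, time, d_model)
        z_label = self.label_embed(labels.long())   # (batch, time, d_model)
        # Summing keeps one token per time step; training both layers during
        # fine-tuning is what aligns the two modalities in the latent space.
        return z_sensor + z_label


# Usage on toy data (shapes assumed for illustration)
emb = DualInputEmbedding(n_sensors=16)
x = torch.randn(4, 96, 16)             # 4 windows, 96 time steps, 16 sensors
y = torch.randint(0, 2, (4, 96))       # historical anomaly labels per step
tokens = emb(x, y)                     # (4, 96, 128), fed to the backbone
```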
Notes
- 1.
In practice, to keep the model from tending to classify all sequences as potentially anomalous, we perform time-dimensional selection over each batch of windows rather than over each individual window. We only dismiss batches with a low anomaly rate, and thus still allow individual windows without anomalies as input (see the sketch below).
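A minimal sketch of this batch-level selection, assuming a simple anomaly-rate threshold; the threshold value, tensor layout, and loop variables are illustrative assumptions rather than details taken from the paper.

```python
# Illustrative sketch of batch-level time-dimensional selection: whole batches
# whose anomaly rate falls below a threshold are skipped during fine-tuning,
# while anomaly-free windows inside retained batches are kept.
import torch


def keep_batch(labels: torch.Tensor, min_rate: float = 0.05) -> bool:
    # labels: (batch, time) binary anomaly labels for one batch of windows.
    anomaly_rate = labels.float().mean().item()
    return anomaly_rate >= min_rate


# Hypothetical fine-tuning loop (loader and model are placeholders):
# for sensors, labels in loader:
#     if not keep_batch(labels):
#         continue                      # dismiss mostly-normal batches
#     loss = model(sensors, labels)     # windows without anomalies still pass
```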
Copyright information
© 2025 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Liang, Y. et al. (2025). ExpertAP: Leveraging Multi-unit Operational Patterns for Advanced Turbine Anomaly Prediction. In: Lan, X., Mei, X., Jiang, C., Zhao, F., Tian, Z. (eds) Intelligent Robotics and Applications. ICIRA 2024. Lecture Notes in Computer Science, vol. 15208. Springer, Singapore. https://doi.org/10.1007/978-981-96-0783-9_24
Publisher Name: Springer, Singapore
Print ISBN: 978-981-96-0782-2
Online ISBN: 978-981-96-0783-9