Abstract
This work presents an investigation into the ability of recurrent neural networks (RNNs) to provide long-term predictions of time series data generated by coal-fired power plants. While there are numerous studies that have used artificial neural networks (ANNs) to predict coal plant parameters, to the authors’ knowledge these have almost entirely been restricted to predicting values at the next time step, not farther into the future. Using a novel neuro-evolution strategy called Evolutionary eXploration of Augmenting Memory Models (EXAMM), we evolved RNNs with advanced memory cells to predict per-minute plant parameters and per-hour boiler parameters up to 8 hours into the future. These data sets pose challenging prediction tasks, as the parameters being predicted exhibit spiking behavior. While the evolved RNNs were able to successfully predict the spikes in the hourly data, they did not accurately predict their severity. The per-minute data proved even more challenging: medium-range predictions miscalculated the beginning and end of spikes, and longer-range predictions reverted to long-term trends and ignored the spikes entirely. We hope this initial study will motivate further research into this highly challenging prediction problem. The use of fuel properties data generated by a new Coal Tracker Optimization (CTO) program was also investigated, and this work shows that incorporating these data improved the predictive ability of the evolved RNNs.
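To make the forecasting setup concrete, the following is a minimal sketch in Python with pandas (not the authors’ actual EXAMM pipeline) of how an 8-hour-ahead prediction target can be framed by shifting the predicted column against the inputs; the file name and the choice of target column are hypothetical placeholders, with the column name borrowed from Appendix C.

    import pandas as pd

    # Minimal sketch, not the EXAMM pipeline: frame an h-step-ahead forecasting
    # task by shifting the target column so that the inputs at time t are paired
    # with the target value observed h steps later.
    # "boiler_hourly.csv" and the target column choice are hypothetical.
    HORIZON = 8  # e.g., 8 hours ahead on the per-hour boiler data
    TARGET = "Main Steam Temp (Superheater Outlet)"

    df = pd.read_csv("boiler_hourly.csv")

    # Row t now holds the inputs at time t and the target observed at t + HORIZON.
    df[f"{TARGET} +{HORIZON}h"] = df[TARGET].shift(-HORIZON)

    # The last HORIZON rows have no future observation to predict; drop them.
    framed = df.dropna(subset=[f"{TARGET} +{HORIZON}h"])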
Zusammenfassung
This work investigates the ability of recurrent neural networks (RNNs) to provide long-term predictions of time series data arising in coal-fired power plants. Although there are numerous studies that use artificial neural networks (ANNs) to predict coal power plant parameters, to the authors’ knowledge these are almost exclusively limited to predicting values at the next time step and do not reach farther into the future. Using a novel neuro-evolution approach, which we call Evolutionary eXploration of Augmenting Memory Models (EXAMM), RNNs with advanced memory cells were evolved to predict per-minute plant parameters and per-hour boiler parameters up to 8 hours into the future. These data sets pose challenging prediction tasks, as the parameters to be predicted exhibit spiking behavior. While the evolved RNNs were able to predict the spikes in the hourly data well, they could not accurately predict their severity. The per-minute data proved even more difficult, as the medium-range predictions miscalculated the beginning and end of spikes, and the longer-range predictions reverted to long-term trends or ignored the spikes entirely. We hope that this initial study will motivate further research into this highly challenging prediction problem. Also investigated was the use of fuel properties data generated by a novel program called Coal Tracker Optimization (CTO); this work shows that their use improved the predictive ability of the evolved RNNs.
Funding source: U.S. Department of Energy
Award Identifier / Grant number: FE0031547
Funding statement: This material is supported in part by the U.S. Department of Energy, Office of Science, Office of Advanced Combustion Systems, under Award Number FE0031547.
About the authors

Dr. Desell is an Associate Professor specializing in Data Science. His research focuses on the application of machine learning to large-scale, real-world data sets using high performance and distributed computing, with an emphasis on developing systems for practical scientific use. He is particularly interested in the intersection of evolutionary algorithms and neural networks, or ‘neuro-evolution’, where evolutionary algorithms are used to automate and optimize the design of neural network architectures.

Mr. ElSaid is a Computer and Information Sciences Ph.D. candidate at the Rochester Institute of Technology. He received his B.Sc. in Aerospace Engineering from Cairo University and his master’s degree in Computer Science from the University of North Dakota. His research focuses on metaheuristics, nature-inspired optimization techniques, and computing-based solutions to engineering problems.

Ms. Lyu received her B.Eng. degree in Automation from Nanjing Agricultural University, Nanjing, China, in 2016, and her M.S. degree in Computer Science from Syracuse University, Syracuse, NY, in 2018. She is currently pursuing a Ph.D. in Computer Science at the Rochester Institute of Technology, Rochester, NY. Her current research interests include neuroevolution, computer vision, and machine learning.

Mr. Stadem specializes in advanced analytical techniques as well as algorithm and method development and optimization. He utilizes the expertise of Microbeam Technologies, Inc. (MTI) in ash-related phenomena to model the impact of fuel properties on emissions and efficiency in energy systems. He is interested in integrating physical and chemical relationships with laboratory analysis data and real-time operations data to generate accurate and robust hybrid models.

Ms. Shuchita Patwardhan’s principal areas of expertise are managing the impacts of fuel properties on fireside slagging and fouling in combustion and gasification systems, analyzing power plant operations data, and computer modeling for energy systems. She currently manages and works on projects aimed at improving the efficiency, reliability, and flexibility of power plants by developing neural-network-augmented tools. She has also administered field work to generate data that has helped in the development of computer-based models to forecast fuel quality and power plant performance.

Dr. Benson, President of Microbeam Technologies Incorporated, specializes in the development of tools to predict the impact of fuel properties on large-scale energy conversion system design and performance. He is specifically interested in the integration of phenomenological, computational fluid dynamics, and neural network methods to improve energy conversion system efficiency and reliability.
Acknowledgment
We thank Microbeam Technologies, Inc., for their help in collecting and preparing the coal-fired power plant data.
Appendix A Burner parameters
Conditioner Inlet Temp
Conditioner Outlet Temp
Coal Feeder Rate
Primary Air Flow
Primary Air Split
System Secondary Air Flow Total
Secondary Air Flow
Secondary Air Split
Tertiary Air Split
Total Combined Air Flow
Supplementary Fuel Flow
Main Flame Intensity
Appendix B Fuel properties parameters
Base Acid Ratio
Ash Content
Na (Sodium) Content
Fe (Iron) Content
BTU
Ash Flow
Na (Sodium) Flow
Fe (Iron) Flow
Appendix C Boiler parameters
Cold Reheat Steam Temperature
ECON Flue Gas Out Press
ECON HDR 01 Out Temp
ECON HDR 02 Out Temp
ECON In Gas Temp
Economizer Differential
Economizer Gas Outlet O2 Level
Economizer Inlet Feedwater Flow
Economizer Inlet Feedwater Temperature
Economizer Outlet Avg
Economizer Gas Recirc Outlet Temperature
Fuel Cost For 1 Btu/Kwh Heat Rate Deviation
Gross Generator Output
Hot Reheat Temperature (Reheater Outlet)
Main Steam Pressure At Boiler
Main Steam Temp (Superheater Outlet)
Main Steam Press
Main Steam Spray Flow
Main Steam Spray Press
NOx Master Out
Net Plant Heat Rate
Net Unit Generation
Nose Gas Temperature
Prim Suphtr Differential
PSH Gas Outlet Temperature
PSH Outlet Avg
PSH Superheater Gas Inlet Temperature
RH Suphtr Bank-1 Diff
RH Suphtr Bank-2 Diff
Sec SH Inlet Temp Avg
Sec SH Outlet Temp Avg
SSH Inlet HDR 01 Temp
SSH Inlet HDR 04 Temp
SSH Out HDR TC 01 Temp
SSH Out HDR TC 02 Temp
SSH Out HDR TC 03 Temp
SSH Out HDR TC 04 Temp
SSH Outlet HDR TC 05 Temp
Total OFA Air Flow
Water Wall Raw Cleanliness
Avg Conditioner Inlet Temp
Avg Conditioner Outlet Temp
Total Lignite Feeder Rate
Total Primary Air Flow
Avg Primary Air Split
Total Secondary Air Flow
Avg Secondary Air Split
Total Tertiary Air Flow
Avg Tertiary Air Split
Total Combined Air Flow
Total Main Oil Flow
Avg Main Flame Intensity
Time Until Next Shutdown
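For illustration only, the short Python sketch below shows how the burner, fuel properties, and boiler parameters listed in Appendices A, B, and C could be assembled into input features and a prediction target, including feature sets with and without the CTO-generated fuel properties; the file name, the particular column subset, and the choice of target are assumptions made for the example rather than details taken from the study.

    import pandas as pd

    # Illustrative sketch only: pick a few of the listed parameters as input
    # features and one as the prediction target. The file name and the
    # particular feature/target split are hypothetical.
    BURNER_FEATURES = ["Conditioner Inlet Temp", "Coal Feeder Rate", "Primary Air Flow"]
    FUEL_FEATURES = ["Base Acid Ratio", "Ash Content", "BTU"]  # CTO fuel properties
    TARGET = "Main Flame Intensity"

    df = pd.read_csv("plant_per_minute.csv")

    # Feature sets with and without the CTO fuel properties, so their effect
    # on predictive ability can be compared.
    X_without_cto = df[BURNER_FEATURES]
    X_with_cto = df[BURNER_FEATURES + FUEL_FEATURES]
    y = df[TARGET]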