
An ensemble of neural networks for weather forecasting

  • Original Article
  • Neural Computing & Applications

Abstract

This study demonstrates the applicability of an ensemble of artificial neural networks (ANNs) and learning paradigms for weather forecasting in southern Saskatchewan, Canada. The proposed ensemble method for weather forecasting has advantages over other techniques such as linear combination. Generally, the output of an ensemble is a weighted sum whose weights are fixed, with the weights determined from the training or validation data. In the proposed approach, the weights are determined dynamically from the respective certainties of the network outputs: the more certain a network seems to be of its decision, the higher its weight. The performance of the proposed ensemble model is contrasted with that of multi-layered perceptron network (MLPN), Elman recurrent neural network (ERNN), radial basis function network (RBFN) and Hopfield model (HFM) predictive models, as well as regression techniques. Temperature, wind speed and relative humidity data are used to train and test the different models. With each model, 24-h-ahead forecasts are made for the winter, spring, summer and fall seasons. The performance and reliability of the seven models are then evaluated by a number of statistical measures. Among the direct approaches employed, empirical results indicate that HFM is relatively less accurate and RBFN is relatively more reliable for the weather forecasting problem. In comparison, the ensemble of neural networks produced the most accurate forecasts.
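The dynamic weighting idea can be illustrated with a minimal sketch (not the authors' exact formulation): each member network contributes to the combined forecast in proportion to a certainty score attached to its output, so more certain members receive larger weights at prediction time rather than weights fixed in advance. All function names and the example numbers below are hypothetical.

```python
import numpy as np

def dynamic_ensemble_forecast(member_predictions, member_certainties):
    """Combine member forecasts with weights set at prediction time.

    member_predictions : one forecast per member network, shape (n_members,)
    member_certainties : a per-forecast certainty score for each member
                         (e.g. the inverse of an estimated error); higher
                         certainty yields a higher weight.
    """
    preds = np.asarray(member_predictions, dtype=float)
    cert = np.asarray(member_certainties, dtype=float)

    # Normalise certainties into weights that sum to one, so the more
    # certain a member is, the larger its share of the combined forecast.
    weights = cert / cert.sum()
    return float(np.dot(weights, preds))

# Hypothetical example: three networks forecast tomorrow's temperature (deg C)
# with different certainty scores for this particular input.
forecasts = [4.2, 3.8, 5.1]
certainties = [0.9, 0.6, 0.3]  # assumed, e.g. inverse validation error
print(dynamic_ensemble_forecast(forecasts, certainties))
```

Unlike a fixed linear combination, the weights here are recomputed for every input, so a member that is unreliable for a particular weather situation is automatically down-weighted for that forecast.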


References

  1. Bishop CM (1995) Neural networks for pattern recognition. Oxford University Press, Oxford, UK, pp 364-369

  2. Breiman L (1999) Combining predictors. In: Sharkey AJC (ed) Combining artificial neural nets: ensemble and modular multi-net systems. Springer, Berlin Heidelberg New York, pp 31-50

  3. Carney JG, Cunningham P (1999a) Confidence and prediction intervals for neural network ensembles. In: Proceedings of the international joint conference on neural networks (IJCNN’99), Washington DC, July 1999

  4. Carney JG, Cunningham P (1999b) Tuning diversity in bagged neural network ensembles. Technical report TCD-CS-1999-44, Trinity College, Dublin

  5. Cho SB, Ahn JH, Lee SI (2001) Exploiting diversity of neural ensembles with speciated evolution. In: Proceedings of the international joint conference on neural networks (IJCNN’01), Washington DC, July 2001, vol 2, pp 808-813

  6. Hansen LK, Salamon P (1990) Neural network ensembles. IEEE Trans Pattern Anal 12(10):993-1001

    Article  Google Scholar 

  7. Hartono P, Hashimoto S (2001) Learning from imperfect superior using neural network ensemble. IPSJ J 42(5):1214-1222

    Google Scholar 

  8. Islam MM, Shahjahan M, Murase K (2001) Exploring constructive algorithms with stopping criteria to produce accurate and diverse individual neural networks in an ensemble. In: Proceedings of the IEEE international conference on systems man and Cybernetics, Tucson, Arizona, October 2001, vol 3, pp 1526-1531

  9. Jiang Y, Zhou ZH, Chen ZQ (2002) Rule learning based on neural network ensemble. In: Proceedings of the international joint conference on neural networks (IJCNN’02), Honolulu, Hawaii, May 2002, vol 2, pp 1416-1420

  10. Jimenez D, Walsh N (1998) Dynamically weighted ensemble neural networks for classification. In: Proceedings of the international joint conference on neural networks (IJCNN’98), Anchorage, Alaska, May 1998, pp 753-756

  11. Khan MR, Ondrusek C (2000) Short-term electric demand prognosis using artificial neural networks. Electr Eng 51:296-300

    Google Scholar 

  12. Krogh A, Vedelsby J (1995) Neural network ensembles, cross validation and active learning. In: Tesauro G, Touretzky DS, Keen TK (eds) Neural information processing systems, vol 7. MIT Press, Cambridge, Massachusetts, pp 231-238

  13. Kuligowski RJ, Barros AP (1998) Localized precipitation forecasts from a numerical weather prediction model using artificial neural networks. Weather Forecast 13:1194-1205

    Article  Google Scholar 

  14. Liu Y, Yao X (1997) Evolving modular neural networks which generalize well. In: Proceedings of the IEEE international conference on evolutionary computation (ICEC’97), Indianapolis, Indiana, April 1997, pp 605-610

  15. Liu Y, Yao X, Higuchi T (2000) Evolutionary ensembles with negative correlation learning. IEEE Trans Evolut Comput 4(4):380-387

    Google Scholar 

  16. Lorenz EN (1969) Three approaches to atmospheric predictability. Bull Am Meteorol Soc 50:345-349

    Google Scholar 

  17. Mao J (1998) A case study on bagging, boosting and basic ensembles of neural networks for OCR. In: Proceedings of the joint conference on neural networks (IJCNN’98), Anchorage, Alaska, May 1998, vol 3, pp 1828-1833

  18. Maqsood I, Khan MR, Abraham A (2002a) Intelligent weather monitoring systems using connectionist models. Neural Parallel Sci Comput 10:157-178

    Google Scholar 

  19. Maqsood I, Khan MR, Abraham A (2002b) Neurocomputing based Canadian weather analysis. In: Proceedings of the 2nd international workshop on intelligent systems design and applications (ISDA’02), Atlanta, Georgia, August 2002. Dynamic Publishers, Atlanta, Georgia, pp 39-44

  20. Moro QI, Alonso L, Vivaracho CE (1994) Application of neural networks to weather forecasting with local data. In: Proceedings of the 12th IASTED international conference on applied informatics, Annecy, France, May 1994, pp 68-70

  21. Perrone MP, Cooper LN (1993) When networks disagree: ensemble methods for hybrid neural networks. In: Mammone RJ (ed) Neural networks for speech and image processing. Chapman-Hall, London

  22. Rosen BE (1996) Ensemble learning using decorrelated neural networks. Connect Sci 8(3-4):373-384

    Google Scholar 

  23. Sharkey AJC (1999) Combining artificial neural nets: ensemble and modular multi-net systems. Springer, Berlin Heidelberg New York

    MATH  Google Scholar 

  24. Shimshoni Y, Intrator N (1998) Classification of seismic signals by integrating ensembles of neural networks. IEEE Trans Signal Proces 46(5):1194-1201

    Article  Google Scholar 

  25. Sollich P, Krogh A (1996) Learning with ensembles: how over-fitting can be useful. In: Touretzky DS, Mozer MC, Hasselmo ME (eds) Advances in neural information processing systems 8. MIT Press, Cambridge, Massachusetts, pp 190-196

    Google Scholar 

  26. Tumer K, Ghosh J (1996) Error correlation and error reduction in ensemble classifiers. Connect Sci (special issue on combining artificial neural networks: ensemble approaches) 8(3-4):385-404

  27. Zhou ZH, Wu J, Tang W (2002) Ensembling neural networks: many could be better than all. Artif Intell 137(1-2):239-263

    Google Scholar 

  28. Zurada JM (1992) Introduction to artificial neural systems. West Publishing Company, Saint Paul, Minnesota

Download references

Acknowledgements

The authors would like to thank the staff of Environment Canada for providing the weather data needed for this study. The authors are also grateful for the comments of the anonymous reviewers, which helped to improve the presentation of this paper.

Author information

Correspondence to Imran Maqsood.

About this article

Cite this article

Maqsood, I., Khan, M.R. & Abraham, A. An ensemble of neural networks for weather forecasting. Neural Comput & Applic 13, 112–122 (2004). https://doi.org/10.1007/s00521-004-0413-4

