
MLP, Gaussian Processes and Negative Correlation Learning for Time Series Prediction

  • Conference paper
Multiple Classifier Systems (MCS 2009)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 5519)

Abstract

Time series forecasting is a challenging problem with a wide variety of application domains, such as engineering, the environment, and finance. When confronted with a time series forecasting application, typically a number of different forecasting models are tested and the best one is selected. Alternatively, instead of choosing the single best method, a wiser course of action can be to choose a group of the best models and combine their forecasts. In this study we propose a combined model consisting of a Multi-Layer Perceptron (MLP), Gaussian Process Regression (GPR) and a Negative Correlation Learning (NCL) model. The MLP and GPR were the top performers in a previous large-scale comparative study, while NCL offers an alternative way of building accurate and diverse ensembles. No studies have reported on the performance of NCL in time series prediction, so in this work we test the efficiency of NCL in predicting time series data. Results on two real data sets show that NCL is a good candidate model for forecasting time series. In addition, the study shows that the combined MLP/GPR/NCL model outperforms all models under consideration.
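The two ideas the abstract relies on can be sketched briefly. In standard NCL, each ensemble member i is trained on its squared error plus a penalty p_i = (f_i - f̄) Σ_{j≠i} (f_j - f̄) that rewards members for deviating from the ensemble mean f̄ in opposite directions, and the combined forecast is the average of the members' forecasts. The snippet below is a minimal illustration of that penalty and of simple forecast averaging, not the authors' implementation; the function name, the example values, and the choice of lambda are all hypothetical.

```python
import numpy as np

def ncl_losses(preds, y, lam=0.5):
    """Per-member NCL losses for a single target value (illustrative sketch).

    preds : array of shape (M,), forecasts of the M ensemble members
    y     : scalar target
    lam   : penalty strength lambda; lam = 0 reduces to independent training
    """
    fbar = preds.mean()
    # For member i, sum over j != i of (f_j - fbar) equals -(f_i - fbar),
    # so the NCL penalty simplifies to -(f_i - fbar)^2.
    penalty = -(preds - fbar) ** 2
    return (preds - y) ** 2 + lam * penalty

# Hypothetical forecasts of three ensemble members for one time step
preds = np.array([1.0, 2.0, 3.0])
y = 2.0

losses = ncl_losses(preds, y, lam=0.5)   # [0.5, 0.0, 0.5]
combined = preds.mean()                  # simple-average combined forecast: 2.0
```

Because the penalty is negative for members far from the ensemble mean, minimizing these losses pushes the members toward accurate but mutually diverse forecasts, which is what makes the resulting ensemble average more robust than any single member.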





Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Azmy, W.M., El Gayar, N., Atiya, A.F., El-Shishiny, H. (2009). MLP, Gaussian Processes and Negative Correlation Learning for Time Series Prediction. In: Benediktsson, J.A., Kittler, J., Roli, F. (eds) Multiple Classifier Systems. MCS 2009. Lecture Notes in Computer Science, vol 5519. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02326-2_43

  • DOI: https://doi.org/10.1007/978-3-642-02326-2_43

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-02325-5

  • Online ISBN: 978-3-642-02326-2

  • eBook Packages: Computer Science (R0)
