
Optimized Neural Network Ensemble by Combination of Particle Swarm Optimization and Differential Evolution

  • Conference paper
Advances in Neural Networks – ISNN 2013 (ISNN 2013)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 7951)

Abstract

The Neural Network Ensemble (NNE) is an effective method in which the outputs of separately trained neural networks are combined to form the final prediction. In this paper, we introduce an improved Neural Network Ensemble (INNE) in which each component feedforward neural network (FNN) is optimized by particle swarm optimization (PSO) and the back-propagation (BP) algorithm, while the ensemble weights are trained by a cooperative Particle Swarm Optimization and Differential Evolution algorithm (PSO-DE). The algorithm maintains two distinct populations, one evolved by PSO and the other by DE. In addition, we incorporate the fitness value from the previous iteration into the velocity update to enhance the global search ability. Our experiments demonstrate that the improved NNE outperforms existing popular NNE methods.
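
As a rough illustration of the weight-training idea described in the abstract, the following is a minimal Python sketch of a cooperative PSO-DE search over the ensemble weights. The function names, constants, and the exact way the last-iteration fitness enters the velocity update are assumptions for illustration, not the authors' implementation.

# Minimal, illustrative sketch (not the authors' code) of training ensemble
# weights with a cooperative PSO-DE search. All names, parameter values, and
# the placement of the fitness term in the velocity update are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def fitness(w, preds, targets):
    # Mean squared error of the ensemble prediction built from weight vector w.
    w = np.abs(w) / (np.abs(w).sum() + 1e-12)   # non-negative weights summing to 1
    return np.mean((preds @ w - targets) ** 2)

def pso_de_weights(preds, targets, pop_size=20, iters=200):
    dim = preds.shape[1]                        # one weight per component network
    # Two distinct populations: one evolved by PSO, the other by DE.
    pso_pop = rng.random((pop_size, dim))
    de_pop = rng.random((pop_size, dim))
    vel = np.zeros_like(pso_pop)
    pbest = pso_pop.copy()
    pbest_fit = np.array([fitness(x, preds, targets) for x in pso_pop])
    gbest = pbest[pbest_fit.argmin()].copy()

    for _ in range(iters):
        # ----- PSO population -----
        fit_now = np.array([fitness(x, preds, targets) for x in pso_pop])
        improved = fit_now < pbest_fit
        pbest[improved] = pso_pop[improved]
        pbest_fit[improved] = fit_now[improved]

        # Velocity update. Scaling by the last iteration's normalised fitness is
        # a rough stand-in for feeding that fitness into the velocity update so
        # that worse particles explore more aggressively (an assumption here).
        scale = 1.0 + fit_now[:, None] / (fit_now.max() + 1e-12)
        r1, r2 = rng.random((2, pop_size, dim))
        vel = scale * (0.7 * vel
                       + 1.5 * r1 * (pbest - pso_pop)
                       + 1.5 * r2 * (gbest - pso_pop))
        pso_pop = np.clip(pso_pop + vel, 0.0, 1.0)

        # ----- DE population (DE/rand/1/bin) -----
        for i in range(pop_size):
            a, b, c = de_pop[rng.choice(pop_size, 3, replace=False)]
            mutant = np.clip(a + 0.5 * (b - c), 0.0, 1.0)
            cross = rng.random(dim) < 0.9
            trial = np.where(cross, mutant, de_pop[i])
            if fitness(trial, preds, targets) < fitness(de_pop[i], preds, targets):
                de_pop[i] = trial

        # ----- Cooperation: the global best is shared across both populations -----
        cand = np.vstack([pbest, de_pop])
        cand_fit = np.array([fitness(x, preds, targets) for x in cand])
        gbest = cand[cand_fit.argmin()].copy()

    return np.abs(gbest) / (np.abs(gbest).sum() + 1e-12)

# Toy usage: predictions of 3 hypothetical component networks on 100 samples.
preds = rng.normal(size=(100, 3))
targets = preds @ np.array([0.5, 0.3, 0.2]) + 0.01 * rng.normal(size=100)
print(pso_de_weights(preds, targets))

Note that the paper's method also pre-trains each component FNN with PSO and BP before the ensemble weights are fitted; the sketch above covers only the weight-combination stage.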




Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Zhao, Z.S., Feng, X., Wei, F., Wang, S.K., Cao, M.Y., Hou, Z.G. (2013). Optimized Neural Network Ensemble by Combination of Particle Swarm Optimization and Differential Evolution. In: Guo, C., Hou, Z.G., Zeng, Z. (eds.) Advances in Neural Networks – ISNN 2013. ISNN 2013. Lecture Notes in Computer Science, vol 7951. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-39065-4_45

  • DOI: https://doi.org/10.1007/978-3-642-39065-4_45

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-39064-7

  • Online ISBN: 978-3-642-39065-4

  • eBook Packages: Computer Science (R0)
