An active learning brain storm optimization algorithm with a dynamically changing cluster cycle for global optimization

Abstract

An active learning brain storm optimization (ALBSO) algorithm with a dynamically changing cluster cycle is proposed to enhance the performance of the brain storm optimization (BSO) algorithm. In the original BSO, a new individual is generated simply by selecting one individual from one cluster or two individuals from two clusters, at random and with certain probabilities. Although this random selection maintains population diversity well, it seriously limits the performance of BSO. In machine learning, a learner equipped with active learning does not learn from random samples; instead, it selects its own training data, such as the best or mean values. Drawing on this idea, an active learning strategy combining intra-cluster learning and inter-cluster learning is integrated into BSO to improve both solution accuracy and convergence speed. In addition, to reduce the time spent on clustering, a dynamically changing cluster cycle with one-step clustering is adopted in the proposed ALBSO. Experimental analysis shows that, by applying the active learning strategy together with the dynamically changing cluster cycle on top of the original diversity-maintenance framework, ALBSO balances diversification and intensification while remarkably reducing time complexity. An experimental study on a set of 38 test functions and two real-world problems shows that the performance of BSO is significantly improved, and that ALBSO is an effective method for optimizing complex unimodal and multimodal functions in comparison with several other well-known algorithms.
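
To make the description above concrete, the short Python sketch below implements a BSO-style loop with best-guided intra-cluster learning, mean-guided inter-cluster learning, and a re-clustering cycle that lengthens over time. It is a minimal illustration under assumptions, not the authors' exact ALBSO: the function names (albso_like, kmeans_labels, sphere), the parameter values, the logistic step-size decay, the greedy replacement rule, and the specific cycle-growth formula are all choices made here for readability.

import numpy as np

# Toy objective used only for this demonstration (minimization).
def sphere(x):
    return float(np.sum(x ** 2))

# Very small k-means used only to group individuals into clusters.
def kmeans_labels(pop, k, iters=5, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    centers = pop[rng.choice(len(pop), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((pop[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pop[labels == j].mean(axis=0)
    return labels

def albso_like(f, dim=10, pop_size=50, k=5, max_iter=500,
               p_one_cluster=0.8, base_cycle=10, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, size=(pop_size, dim))
    fit = np.array([f(x) for x in pop])
    labels = kmeans_labels(pop, k, rng=rng)
    for t in range(max_iter):
        # Dynamically changing cluster cycle (assumed schedule): re-cluster only
        # every `cycle` iterations, and let the cycle lengthen as the search matures.
        cycle = base_cycle + int(base_cycle * t / max_iter)
        if t % cycle == 0:
            labels = kmeans_labels(pop, k, rng=rng)
        # Logistic (logsig-style) step-size decay; the exact schedule is assumed.
        step = 0.5 / (1.0 + np.exp((t - 0.5 * max_iter) / (0.1 * max_iter)))
        for i in range(pop_size):
            if rng.random() < p_one_cluster:
                # Intra-cluster learning: learn from the best individual of i's own cluster.
                members = np.where(labels == labels[i])[0]
                guide = pop[members[np.argmin(fit[members])]]
            else:
                # Inter-cluster learning: learn from a blend of two clusters' mean points.
                c1, c2 = rng.choice(k, 2, replace=False)
                m1 = pop[labels == c1].mean(axis=0) if np.any(labels == c1) else pop[i]
                m2 = pop[labels == c2].mean(axis=0) if np.any(labels == c2) else pop[i]
                w = rng.random()
                guide = w * m1 + (1.0 - w) * m2
            cand = guide + step * rng.standard_normal(dim)
            cf = f(cand)
            if cf < fit[i]:  # keep the better of the old and new individual
                pop[i], fit[i] = cand, cf
    return pop[np.argmin(fit)], float(fit.min())

if __name__ == "__main__":
    best_x, best_f = albso_like(sphere)
    print("best objective found:", best_f)

The structural points the sketch is meant to show are that the guiding individual is chosen deliberately (cluster best or cluster mean) rather than at random, and that k-means runs only once per cycle instead of every iteration, which is where the saving in clustering time would come from.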

Acknowledgements

This research work was partially supported by the National Natural Science Foundation of China (Grant Nos. 61773314, 61672027, 61703332, 61572392), the Key Research and Development Plan Project of the Shaanxi Science and Technology Department (Grant No. 2017ZDXM-GY-016), the Project of Department of Education Science Research of Shaanxi Province (Grant No. 17JK0371), and the Fund of the National Laboratory of Network and Detection Control (Grant No. GSYSJ2016007).

Author information

Corresponding author

Correspondence to Zijian Cao.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Cao, Z., Wang, L. An active learning brain storm optimization algorithm with a dynamically changing cluster cycle for global optimization. Cluster Comput 22, 1413–1429 (2019). https://doi.org/10.1007/s10586-019-02918-0
