
Multilevel framework for large-scale global optimization

Abstract

Large-scale global optimization (LSGO) algorithms are crucially important for handling real-world problems. Recently, cooperative co-evolution (CC) algorithms have been applied successfully to many large-scale practical problems. Many applications contain imbalanced subcomponents, in which both the sizes of the subcomponents and their contributions to the objective function value differ. CC algorithms often lose their efficiency on LSGO problems with imbalanced subcomponents because they do not take the unequal influence of the variables into account. In this paper, we propose a multilevel optimization framework based on variable effects (MOFBVE), which optimizes several subcomponents built from the most important variables in the early stages of the optimization procedure before optimizing the problem in the original search space at the final stage. Sensitivity analysis (SA) determines how variation in the outputs of a model can be attributed to variation in its input parameters. MOFBVE computes the main effect of the variables using an SA method, Morris screening, and then employs k-means clustering to construct groups of variables with similar effects on the fitness value. The constructed groups are sorted in descending order of their contribution to the fitness value, and the top groups are selected as the levels of important variables. By working with these simplified models, MOFBVE reduces the complexity of the search space and achieves more efficient exploration. The performance of MOFBVE is benchmarked on imbalanced LSGO problems, namely two individually modified versions of the CEC-2010 benchmark functions and the CEC-2013 LSGO benchmark functions. The simulation experiments confirm that MOFBVE achieves promising performance on the majority of the imbalanced LSGO test functions. MOFBVE is also compared with state-of-the-art CC algorithms, and the results show that it is better than, or at least comparable to, these algorithms.
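Although the full algorithmic details appear in the body of the article, the workflow outlined above (Morris-style screening of main effects, k-means grouping of variables by effect, and selection of the most influential groups as levels) can be illustrated with a short sketch. The following Python fragment is only a minimal illustration under assumed names and settings (`morris_main_effects`, `build_levels`, and the numbers of trajectories, clusters, and levels are all illustrative choices), not the authors' implementation.

```python
# Minimal sketch of the variable-grouping idea described in the abstract
# (not the authors' code): estimate each variable's main effect with a
# simplified one-at-a-time Morris screening, cluster variables by effect
# with k-means, and rank the clusters by their average effect.
import numpy as np
from sklearn.cluster import KMeans

def morris_main_effects(f, lb, ub, n_trajectories=10, delta=0.1):
    """Rough estimate of mu* (mean absolute elementary effect) per variable."""
    d = len(lb)
    effects = np.zeros((n_trajectories, d))
    for t in range(n_trajectories):
        x = lb + np.random.rand(d) * (ub - lb)
        fx = f(x)
        for i in np.random.permutation(d):           # one-at-a-time perturbations
            x_new = x.copy()
            x_new[i] += delta * (ub[i] - lb[i])
            f_new = f(x_new)
            effects[t, i] = abs(f_new - fx) / delta   # elementary effect of variable i
            x, fx = x_new, f_new                      # walk along the trajectory
    return effects.mean(axis=0)

def build_levels(mu_star, n_clusters=4, n_levels=2):
    """Cluster variables by main effect and return the most influential clusters."""
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(mu_star.reshape(-1, 1))
    clusters = [np.where(labels == c)[0] for c in range(n_clusters)]
    clusters.sort(key=lambda idx: mu_star[idx].mean(), reverse=True)  # descending effect
    return clusters[:n_levels]      # index sets of the most important variables

if __name__ == "__main__":
    lb, ub = np.full(20, -5.0), np.full(20, 5.0)
    f = lambda x: float(np.sum(np.arange(1, 21) * x ** 2))   # toy imbalanced objective
    print(build_levels(morris_main_effects(f, lb, ub)))
```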

References

  • Saltelli A, Chan K, Scott EM et al (2000) Sensitivity analysis, vol 134. Wiley, New York

  • Saltelli A, Ratto M, Andres T, Campolongo F, Cariboni J, Gatelli D, Saisana M, Tarantola S (2008) Global sensitivity analysis: the primer. Wiley, New York

  • Doerr B, Sudholt D, Witt C (2013) When do evolutionary algorithms optimize separable functions in parallel? In: Proceedings of the twelfth workshop on foundations of genetic algorithms (FOGA XII), ACM, pp 51–64

  • Molina D, Lozano M, Herrera F (2010) MA-SW-Chains: memetic algorithm based on local search chains for large scale continuous global optimization. In: 2010 IEEE congress on evolutionary computation (CEC), IEEE, pp 1–8

  • Sayed E, Essam D, Sarker R (2012a) Dependency identification technique for large scale optimization problems. In: 2012 IEEE congress on evolutionary computation (CEC), IEEE, pp 1–8

  • Sayed E, Essam D, Sarker R (2012b) Using hybrid dependency identification with a memetic algorithm for large scale optimization problems. In: Simulated evolution and learning, Springer, pp 168–177

  • Campolongo F, Cariboni J, Saltelli A (2007) An effective screening design for sensitivity analysis of large models. Environ Model Softw 22(10):1509–1518

  • Wilcoxon F (1945) Individual comparisons by ranking methods. Biometr Bull 1(6):80–83

  • Van den Bergh F, Engelbrecht AP (2004) A cooperative approach to particle swarm optimization. IEEE Trans Evol Comput 8(3):225–239

  • Chen H, Zhu Y, Hu K, He X, Niu B (2008) Cooperative approaches to bacterial foraging optimization. In: Advanced intelligent computing theories and applications. With aspects of artificial intelligence, Springer, pp 541–548

  • Singh HK, Ray T (2010) Divide and conquer in coevolution: a difficult balancing act. In: Agent-based evolutionary search, Springer, pp 117–138

  • Rabitz H, Aliş ÖF (1999) General foundations of high-dimensional model representations. J Math Chem 25(2–3):197–233

  • Wang H, Rahnamayan S, Wu Z (2013) Parallel differential evolution with self-adapting control parameters and generalized opposition-based learning for solving high-dimensional optimization problems. J Parallel Distrib Comput 73(1):62–73

  • Arora JS (2004) Introduction to optimum design. Academic Press, New York

  • Liu J, Tang K (2013) Scaling up covariance matrix adaptation evolution strategy using cooperative coevolution. In: Intelligent data engineering and automated learning – IDEAL 2013, Springer, pp 350–357

  • Derrac J, García S, Molina D, Herrera F (2011) A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evol Comput 1(1):3–18

  • Tang K, Li X, Suganthan PN, Yang Z, Weise T (2010) Benchmark functions for the CEC'2010 special session and competition on large-scale global optimization. Technical report, Nature Inspired Computation and Applications Laboratory (NICAL), USTC, China. http://www.it-weise.de/documents/files/TLSYW2009BFFTCSSACOLSGO.pdf

  • Sun L, Yoshida S, Cheng X, Liang Y (2012) A cooperative particle swarm optimizer with statistical variable interdependence learning. Inf Sci 186(1):20–39

  • MacQueen J et al (1967) Some methods for classification and analysis of multivariate observations. In: Proceedings of the fifth Berkeley symposium on mathematical statistics and probability, vol 1. University of California Press, Berkeley, pp 281–297

  • Morris MD (1991) Factorial sampling plans for preliminary computational experiments. Technometrics 33(2):161–174

  • Potter MA, De Jong KA (1994) A cooperative coevolutionary approach to function optimization. In: Parallel problem solving from nature – PPSN III, Springer, pp 249–257

  • Potter MA (1997) The design and analysis of a computational model of cooperative coevolution. PhD thesis, George Mason University

  • Omidvar MN, Li X, Tang K (2015) Designing benchmark problems for large-scale continuous optimization. Inf Sci 316:419–436

  • Omidvar MN, Li X, Yao X (2011) Smart use of computational resources based on contribution for cooperative co-evolutionary algorithms. In: Proceedings of the 13th annual conference on genetic and evolutionary computation (GECCO), ACM, pp 1115–1122

  • Omidvar MN, Li X, Mei Y, Yao X (2014a) Cooperative co-evolution with differential grouping for large scale optimization. IEEE Trans Evol Comput 18(3):378–393

  • Omidvar MN, Mei Y, Li X (2014b) Effective decomposition of large-scale separable continuous functions for cooperative co-evolutionary algorithms. In: 2014 IEEE congress on evolutionary computation (CEC), IEEE, pp 1305–1312

  • Mahdavi S, Shiri ME, Rahnamayan S (2015) Metaheuristics in large-scale global continues optimization: a survey. Inf Sci 295:407–428

  • Mahdavi S, Shiri ME, Rahnamayan S (2014) Cooperative co-evolution with a new decomposition method for large-scale optimization. In: 2014 IEEE congress on evolutionary computation (CEC), IEEE, pp 1285–1292

  • Rao SS (2009) Engineering optimization: theory and practice. Wiley, New York

  • Chen W, Weise T, Yang Z, Tang K (2010) Large-scale global optimization using cooperative coevolution with variable interaction learning. In: Parallel problem solving from nature – PPSN XI, Springer, Heidelberg, pp 300–309

  • Li X, Tang K, Omidvar MN, Yang Z, Qin AK (2013) Benchmark functions for the CEC'2013 special session and competition on large-scale global optimization. Technical report, RMIT University, Melbourne

  • Li X, Yao X (2009) Tackling high dimensional nonseparable optimization problems by cooperatively coevolving particle swarms. In: 2009 IEEE congress on evolutionary computation (CEC), IEEE, pp 1546–1553

  • Li X, Yao X (2012) Cooperatively coevolving particle swarms for large scale optimization. IEEE Trans Evol Comput 16(2):210–224

  • Tenne Y, Goh CK (eds) (2010) Computational intelligence in expensive optimization problems, vol 2. Springer, New York

  • Liu Y, Yao X, Zhao Q, Higuchi T (2001) Scaling up fast evolutionary programming with cooperative coevolution. In: Proceedings of the 2001 IEEE congress on evolutionary computation, vol 2, IEEE, pp 1101–1108

  • Ren Y, Wu Y (2013) An efficient algorithm for high-dimensional function optimization. Soft Comput 17(6):995–1004

  • Zhao SZ, Suganthan PN, Das S (2011) Self-adaptive differential evolution with multi-trajectory search for large-scale optimization. Soft Comput 15(11):2175–2185

  • Yang Z, Tang K, Yao X (2008a) Large scale evolutionary optimization using cooperative coevolution. Inf Sci 178(15):2985–2999

  • Yang Z, Tang K, Yao X (2008b) Multilevel cooperative coevolution for large scale optimization. In: 2008 IEEE congress on evolutionary computation (IEEE world congress on computational intelligence), IEEE, pp 1663–1670

  • Yang Z, Tang K, Yao X (2008c) Self-adaptive differential evolution with neighborhood search. In: 2008 IEEE congress on evolutionary computation (IEEE world congress on computational intelligence), IEEE, pp 1110–1116

Download references

Acknowledgments

The authors would like to thank the anonymous reviewers for their constructive comments.

Author information

Corresponding author

Correspondence to Shahryar Rahnamayan.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Communicated by V. Loia.

Appendices

Appendix A

The normal coefficients

Tables 22, 23, and 24 present the normal coefficients corresponding to the nonseparable subcomponents in the modified normal CEC-2010 test functions.

Appendix B

The benchmark functions

  • CEC-2010 benchmark functions

Dimension: \(D = 1000\)

Group size: \(m = 50\)

\(x = (x_1, x_2, \dots , x_D)\): The candidate solution

\(o = (o_1, o_2, \dots , o_D)\): The (shifted) global optimum

\(z = x - o, z = (z_1, z_2, \dots , z_D)\): The shifted candidate solution

P: A random permutation of \(\{1, 2, \dots , D\}\)

$$F_{\mathrm{elliptic}}(x)=\sum _{i=1}^{D} (10^6)^{\frac{i-1}{D-1}}\,x_i^2$$
$$F_{\mathrm{rosenbrock}}(x)=\sum _{i=1}^{D-1}\left[100\,(x_i^2-x_{i+1})^2+(x_i-1)^2\right]$$
$$F_{\mathrm{rastrigin}}(x)=\sum _{i=1}^{D}\left[x_i^2-10\cos (2\pi x_i)+10\right]$$
$$F_{\mathrm{ackley}}(x)=-20\exp \left( -0.2\sqrt{\frac{1}{D}\sum _{i=1}^{D}x_i^2}\right) -\exp \left( \frac{1}{D}\sum _{i=1}^{D}\cos (2\pi x_i)\right) +20+e$$
$$F_{\mathrm{schwefel}}(x)=\sum _{i=1}^{D}\left( \sum _{j=1}^{i}x_j \right) ^2$$
$$F_{9}(x)=\sum _{k=1}^{\frac{D}{2m}}F_{\mathrm{rot\_elliptic}}\left[ z\left( P_{(k-1)m+1} : P_{km}\right) \right] +F_{\mathrm{elliptic}}\left[ z\left( P_{\frac{D}{2}+1} : P_D\right) \right]$$
$$F_{10}(x)=\sum _{k=1}^{\frac{D}{2m}}F_{\mathrm{rot\_rastrigin}}\left[ z\left( P_{(k-1)m+1} : P_{km}\right) \right] +F_{\mathrm{rastrigin}}\left[ z\left( P_{\frac{D}{2}+1} : P_D\right) \right]$$
$$F_{11}(x)=\sum _{k=1}^{\frac{D}{2m}}F_{\mathrm{rot\_ackley}}\left[ z\left( P_{(k-1)m+1} : P_{km}\right) \right] +F_{\mathrm{ackley}}\left[ z\left( P_{\frac{D}{2}+1} : P_D\right) \right]$$
$$F_{12}(x)=\sum _{k=1}^{\frac{D}{2m}}F_{\mathrm{schwefel}}\left[ z\left( P_{(k-1)m+1} : P_{km}\right) \right] +F_{\mathrm{sphere}}\left[ z\left( P_{\frac{D}{2}+1} : P_D\right) \right]$$
$$F_{13}(x)=\sum _{k=1}^{\frac{D}{2m}}F_{\mathrm{rosenbrock}}\left[ z\left( P_{(k-1)m+1} : P_{km}\right) \right] +F_{\mathrm{sphere}}\left[ z\left( P_{\frac{D}{2}+1} : P_D\right) \right]$$
$$F_{14}(x)=\sum _{k=1}^{\frac{D}{m}}F_{\mathrm{rot\_elliptic}}\left[ z\left( P_{(k-1)m+1} : P_{km}\right) \right]$$
$$F_{15}(x)=\sum _{k=1}^{\frac{D}{m}}F_{\mathrm{rot\_rastrigin}}\left[ z\left( P_{(k-1)m+1} : P_{km}\right) \right]$$
$$F_{16}(x)=\sum _{k=1}^{\frac{D}{m}}F_{\mathrm{rot\_ackley}}\left[ z\left( P_{(k-1)m+1} : P_{km}\right) \right]$$
$$F_{17}(x)=\sum _{k=1}^{\frac{D}{m}}F_{\mathrm{schwefel}}\left[ z\left( P_{(k-1)m+1} : P_{km}\right) \right]$$
$$F_{18}(x)=\sum _{k=1}^{\frac{D}{m}}F_{\mathrm{rosenbrock}}\left[ z\left( P_{(k-1)m+1} : P_{km}\right) \right]$$
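For readers who want to experiment with these definitions, the base functions above translate almost directly into code. The following NumPy sketch implements the base functions and the D/m-group structure used by F14–F18; it deliberately omits the shift vector o, the rotation matrices, and the coefficients of the modified (imbalanced) variants, so it should be read as an illustrative approximation rather than the official benchmark implementation.

```python
# NumPy sketch of the base functions defined above (shift, rotation, and the
# modified-benchmark coefficients are omitted for brevity).
import numpy as np

def elliptic(x):
    d = len(x)
    return np.sum(10.0 ** (6 * np.arange(d) / (d - 1)) * x ** 2)

def rosenbrock(x):
    return np.sum(100 * (x[:-1] ** 2 - x[1:]) ** 2 + (x[:-1] - 1) ** 2)

def rastrigin(x):
    return np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x) + 10)

def ackley(x):
    d = len(x)
    return (-20 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / d))
            - np.exp(np.sum(np.cos(2 * np.pi * x)) / d) + 20 + np.e)

def schwefel_1_2(x):
    return np.sum(np.cumsum(x) ** 2)

def d_over_m_group(base, z, P, m):
    """D/m-group structure of F14-F18: apply `base` to m-dimensional permuted groups."""
    return sum(base(z[P[k * m:(k + 1) * m]]) for k in range(len(z) // m))

# Example: an unrotated analogue of F15 on a 1000-dimensional shifted point.
D, m = 1000, 50
o = np.random.uniform(-5, 5, D)                  # stand-in for the shifted optimum
x = np.random.uniform(-5, 5, D)
P = np.random.permutation(D)
print(d_over_m_group(rastrigin, x - o, P, m))
```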
  • CEC-2013 benchmark functions

Dimension: \(D = 1000\)

Group size: \(m = 50\)

$$S = \{50, 25, 25, 100, 50, 25, 25, 700\}$$
$$S1 = \{50, 50, 25, 25, 100, 100, 25, 25, 50, 25, 100, 25, 100, 50, 25, 25, 25, 100, 50, 25\}$$

\(x^\mathrm{opt}\) : The optimum decision vector

P: A random permutation of \(\{1, 2, \dots , D\}\)

\(T_\mathrm{osz}\): A transformation function to create smooth local irregularities.

\(T_\mathrm{asy}\): A transformation function to break the symmetry of the symmetric functions.

\(\lambda \): A D-dimensional diagonal matrix whose diagonal elements are used to create ill-conditioning

R: An orthogonal rotation matrix used to rotate the fitness landscape randomly around various axes

m: The overlap size between subcomponents

$$y = x - x^{\mathrm{opt}}$$
$$y_i = y(P_{[C_{i-1}+1]} : P_{[C_i]}), \quad i \in \{1, \ldots , |S|\}$$
$$y_{i1} = y(P_{[C_{i-1}-(i-1)m+1]} : P_{[C_i-(i-1)m]}), \quad i \in \{1, \ldots , |S|\}$$
$$y_{i2} = y(P_{[C_{i-1}-(i-1)m+1]} : P_{[C_i-(i-1)m]}) - x_i^{\mathrm{opt}}, \quad i \in \{1, \ldots , |S|\}$$
$$z_i = T_{\mathrm{osz}}(R_i y_i), \quad i \in \{1, \ldots , |S| - 1\}$$
$$z_{i1} = T_{\mathrm{asy}}^{0.2}(T_{\mathrm{osz}}(R_i y_{i1})), \quad i \in \{1, \ldots , |S| - 1\}$$
$$z_{i2} = T_{\mathrm{asy}}^{0.2}(T_{\mathrm{osz}}(R_i y_{i2})), \quad i \in \{1, \ldots , |S| - 1\}$$
$$z_{|S|} = T_{\mathrm{osz}}(y_{|S|})$$

\(R_i\): a \(|S_i| \times |S_i|\) rotation matrix
$$F_{4}(x)=\sum _{i=1}^{|S|-1}w_i\, f_{\mathrm{elliptic}}(z_i)+f_{\mathrm{elliptic}}(z_{|S|})$$
$$F_{5}(x)=\sum _{i=1}^{|S|-1}w_i\, f_{\mathrm{rastrigin}}(z_i)+f_{\mathrm{rastrigin}}(z_{|S|})$$
$$F_{6}(x)=\sum _{i=1}^{|S|-1}w_i\, f_{\mathrm{ackley}}(z_i)+f_{\mathrm{ackley}}(z_{|S|})$$
$$F_{7}(x)=\sum _{i=1}^{|S|-1}w_i\, f_{\mathrm{schwefel}}(z_i)+f_{\mathrm{schwefel}}(z_{|S|})$$
$$F_{8}(x)=\sum _{i=1}^{|S1|}w_i\, f_{\mathrm{elliptic}}(z_i)$$
$$F_{9}(x)=\sum _{i=1}^{|S1|}w_i\, f_{\mathrm{rastrigin}}(z_i)$$
$$F_{10}(x)=\sum _{i=1}^{|S1|}w_i\, f_{\mathrm{ackley}}(z_i)$$
$$F_{11}(x)=\sum _{i=1}^{|S1|}w_i\, f_{\mathrm{schwefel}}(z_i)$$
$$F_{13}(x)=\sum _{i=1}^{|S1|}w_i\, f_{\mathrm{schwefel}}(z_{i1})$$
$$F_{14}(x)=\sum _{i=1}^{|S1|}w_i\, f_{\mathrm{schwefel}}(z_{i2})$$
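To make the weighted structure of F4–F11 concrete, the sketch below composes a partially separable function from subcomponents with sizes S, each scaled by a weight w_i, plus a fully separable tail. It is a simplified illustration: the shift, rotation, and the T_osz / T_asy transformations of the official benchmark are omitted, and the weights are only assumed to follow a log-normal 10^(3·N(0,1)) form.

```python
# Simplified sketch of the weighted partially separable structure of F4-F7
# (shift, rotation, and T_osz / T_asy transformations are omitted).
import numpy as np

def partially_separable(base, x, sizes, weights, perm):
    z = x[perm]                                    # apply the random permutation P
    bounds = np.cumsum([0] + list(sizes))          # C_0, C_1, ..., C_|S|
    total = 0.0
    for i, w in enumerate(weights):                # the |S| - 1 weighted nonseparable parts
        total += w * base(z[bounds[i]:bounds[i + 1]])
    return total + base(z[bounds[-2]:bounds[-1]])  # the separable tail z_{|S|}

S = [50, 25, 25, 100, 50, 25, 25, 700]
x = np.random.uniform(-100, 100, sum(S))
perm = np.random.permutation(len(x))
w = 10 ** (3 * np.random.randn(len(S) - 1))        # imbalance-inducing weights (assumed form)
print(partially_separable(lambda v: np.sum(v ** 2), x, S, w, perm))
```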

About this article

Cite this article

Mahdavi, S., Rahnamayan, S. & Shiri, M.E. Multilevel framework for large-scale global optimization. Soft Comput 21, 4111–4140 (2017). https://doi.org/10.1007/s00500-016-2060-y
