Enhancing Cooperative Coevolution with Surrogate-Assisted Local Search

Chapter

Part of the book series: Studies in Computational Intelligence (SCI, volume 637)

Abstract

In recent years, an increasing effort has been devoted to the study of metaheuristics suitable for large-scale global optimization in the continuous domain. However, the optimization of functions that are both high-dimensional and computationally expensive has so far attracted little research. To address this issue, this chapter describes an approach in which fitness surrogates are exploited to enhance local search (LS) within the low-dimensional subcomponents of a cooperative coevolutionary (CC) optimizer. The chapter also includes a detailed discussion of the related literature and presents a preliminary experimental study based on typical benchmark functions. According to the results, surrogate-assisted LS within subcomponents can significantly enhance the optimization ability of a CC algorithm.
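To make the idea in the abstract concrete, the sketch below illustrates one possible way to combine cooperative coevolution with a surrogate-assisted local search: the decision vector is split into low-dimensional groups, each group is improved with a cheap mutation-based search, and a simple quadratic surrogate fitted to the group's recent evaluations proposes one extra candidate that is verified on the true (expensive) function. This is a minimal sketch under assumed design choices; the decomposition scheme, the quadratic surrogate, the sphere benchmark, and all names (cc_with_surrogate_ls, fit_quadratic_surrogate, etc.) are illustrative, not the chapter's actual algorithm.

```python
import numpy as np

# Illustrative sketch only: cooperative coevolution (CC) with a surrogate-assisted
# local search (LS) inside each low-dimensional subcomponent.  All design choices
# below are assumptions made for the example, not the chapter's exact method.


def sphere(x):
    """Separable benchmark; stands in for an expensive fitness function."""
    return float(np.sum(x ** 2))


def fit_quadratic_surrogate(X, y):
    """Least-squares fit of a separable quadratic model y ~ c0 + c1*x + c2*x^2."""
    A = np.hstack([np.ones((len(X), 1)), X, X ** 2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef


def surrogate_minimum(coef, lo, hi, dim):
    """Minimise the fitted (separable) quadratic coordinate-wise within the bounds."""
    c1 = coef[1:1 + dim]
    c2 = coef[1 + dim:1 + 2 * dim]
    safe_c2 = np.where(c2 > 1e-12, c2, 1.0)
    # vertex of each parabola; fall back to the lower bound if not convex
    x_star = np.where(c2 > 1e-12, -c1 / (2.0 * safe_c2), lo)
    return np.clip(x_star, lo, hi)


def cc_with_surrogate_ls(f, dim=20, group_size=5, budget=2000,
                         lo=-5.0, hi=5.0, seed=0):
    rng = np.random.default_rng(seed)
    best = rng.uniform(lo, hi, dim)          # context vector shared by all groups
    best_f = f(best)
    evals = 1
    groups = np.array_split(rng.permutation(dim), dim // group_size)

    while evals < budget:
        for idx in groups:                   # round-robin over the subcomponents
            X, y = [], []
            for _ in range(12):              # cheap mutation-based step per group
                trial = best.copy()
                trial[idx] = np.clip(trial[idx] + rng.normal(0.0, 0.3, len(idx)), lo, hi)
                f_trial = f(trial)
                evals += 1
                X.append(trial[idx].copy())
                y.append(f_trial)
                if f_trial < best_f:
                    best, best_f = trial, f_trial
            # surrogate-assisted LS: fit on the group's archive, then spend one
            # true evaluation to verify the surrogate's optimum
            coef = fit_quadratic_surrogate(np.array(X), np.array(y))
            cand = best.copy()
            cand[idx] = surrogate_minimum(coef, lo, hi, len(idx))
            f_cand = f(cand)
            evals += 1
            if f_cand < best_f:
                best, best_f = cand, f_cand
            if evals >= budget:
                break
    return best, best_f


if __name__ == "__main__":
    x_best, f_best = cc_with_surrogate_ls(sphere)
    print(f"best fitness within the budget: {f_best:.4e}")
```

The key point of the sketch is the division of labour: the CC layer keeps each search subproblem low-dimensional, so a very simple surrogate fitted on a handful of points can still be useful, while every surrogate suggestion is checked against the true objective before it replaces the incumbent.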



Author information

Correspondence to Giuseppe A. Trunfio.

Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Trunfio, G.A. (2016). Enhancing Cooperative Coevolution with Surrogate-Assisted Local Search. In: Yang, X.-S. (ed.) Nature-Inspired Computation in Engineering. Studies in Computational Intelligence, vol 637. Springer, Cham. https://doi.org/10.1007/978-3-319-30235-5_4

  • DOI: https://doi.org/10.1007/978-3-319-30235-5_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-30233-1

  • Online ISBN: 978-3-319-30235-5

  • eBook Packages: Engineering, Engineering (R0)
