
Adaptive Information Granulation in Fitness Estimation for Evolutionary Optimization

Published in: Wireless Personal Communications

Abstract

Evolutionary algorithms typically require a large number of fitness evaluations, incurring substantial computational overhead, particularly on complex optimization problems. This paper proposes an adaptive information granulation approach inspired by granular computing, and reduces the number of expensive original fitness evaluations through a fitness inheritance strategy built on the proposed approach. The proposed algorithm is compared with several fitness-inheritance-assisted evolutionary algorithms on traditional benchmark problems at four different dimensions, on the CEC 2013 functions, and on the CEC 2014 expensive optimization test problems with 30 dimensions. Experimental results show both high effectiveness and efficiency, yielding better solutions than the compared algorithms under different finite computation budgets across the benchmark problems. Its advantages are further verified on a real-world light aircraft wing design problem.
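As a rough illustration of the fitness-inheritance idea the abstract describes (not the paper's exact method: the proposed approach adapts the granule size during the run, whereas this sketch uses a fixed radius `tol` and Euclidean similarity, both of which are assumptions here), an individual falling inside the granule of an already-evaluated point inherits that point's fitness instead of triggering an expensive evaluation:

```python
import numpy as np

def estimate_fitness(x, archive, f, tol=0.1):
    """Granulation-based fitness inheritance (fixed-radius sketch).

    If x lies within `tol` of a previously evaluated point (i.e. inside
    that point's granule), inherit its stored fitness; otherwise run the
    expensive true evaluation f(x) and archive the result.

    archive: list of (position, fitness) pairs from true evaluations.
    """
    x = np.asarray(x, dtype=float)
    for xe, fe in archive:
        if np.linalg.norm(x - xe) <= tol:  # x joins an existing granule
            return fe                      # inherited: no call to f
    fx = f(x)                              # expensive original evaluation
    archive.append((x, fx))
    return fx
```

In an evolutionary loop, only granule representatives are evaluated exactly, so the count of true evaluations grows far more slowly than the number of individuals produced.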

Fig. 1
Fig. 2




Acknowledgements

The authors acknowledge the National Natural Science Foundation of China (Grant Nos. 61403272 and 61472269).

Author information


Corresponding author

Correspondence to Ying Tan.

Appendix

In PSO, each particle i has a velocity \(\mathbf{V}_{i} = [v_{i1}, v_{i2}, \ldots, v_{iD}]\) and a position \(\mathbf{X}_{i} = [x_{i1}, x_{i2}, \ldots, x_{iD}]\) while searching the D-dimensional space. The vectors \(\mathbf{V}_{i}\) and \(\mathbf{X}_{i}\) are randomly initialized and then updated by (16) and (17) under the guidance of the personal best position \(\mathbf{P}_{i}\) and the neighborhood best position \(\mathbf{P}_{n}\):

$$\mathbf{V}_{i}^{(t+1)} = \omega \mathbf{V}_{i}^{(t)} + c_{1} r_{1} \left( \mathbf{P}_{i}^{(t)} - \mathbf{X}_{i}^{(t)} \right) + c_{2} r_{2} \left( \mathbf{P}_{n}^{(t)} - \mathbf{X}_{i}^{(t)} \right)$$
(16)
$$\mathbf{X}_{i}^{(t+1)} = \mathbf{X}_{i}^{(t)} + \mathbf{V}_{i}^{(t+1)}$$
(17)

where \(\mathbf{X}_{i}^{(t)}\) and \(\mathbf{V}_{i}^{(t)}\) are the position and velocity of particle i at generation t, ω is the inertia weight, and c1 and c2 are two acceleration coefficients. r1 and r2 are two diagonal matrices whose diagonal elements are independently drawn at random from [0, 1].
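The update rules (16) and (17) can be sketched as a single vectorized NumPy step; since r1 and r2 are diagonal, multiplying by them is an elementwise product. The parameter values shown are common illustrative defaults, not those used in the paper:

```python
import numpy as np

def pso_step(X, V, P, P_n, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One PSO iteration per Eqs. (16)-(17).

    X, V : (N, D) arrays of particle positions and velocities.
    P    : (N, D) personal best positions P_i.
    P_n  : (D,) guiding best position P_n.
    """
    rng = np.random.default_rng() if rng is None else rng
    r1 = rng.random(X.shape)  # diagonal random matrices act elementwise
    r2 = rng.random(X.shape)
    V_new = w * V + c1 * r1 * (P - X) + c2 * r2 * (P_n - X)  # Eq. (16)
    X_new = X + V_new                                        # Eq. (17)
    return X_new, V_new
```

Broadcasting lets the single best vector `P_n` be subtracted from every row of `X`, so all N particles are updated at once.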


Cite this article

Tian, J., Zeng, J., Tan, Y. et al. Adaptive Information Granulation in Fitness Estimation for Evolutionary Optimization. Wireless Pers Commun 103, 741–759 (2018). https://doi.org/10.1007/s11277-018-5474-2
