A novel multi-objective evolutionary algorithm based on LLE manifold learning

  • Original Article
  • Published:
Engineering with Computers

Abstract

In recent years, the locally linear embedding (LLE) manifold learning algorithm has been introduced into multi-objective evolutionary algorithms for solving continuous multi-objective optimization problems (MOPs). This is based on the regularity property that the Pareto set (PS) of a continuous MOP is a piecewise continuous (m − 1)-dimensional manifold, where m is the number of objectives. The goal of LLE is dimensionality reduction. However, the existing LLE-based algorithm applies LLE directly to model the population for offspring reproduction, which has the following weaknesses: (1) the (m − 1)-dimensional manifold constructed from the reconstruction coefficients is not necessarily the manifold of the optimal solution space; (2) during resampling, most of the original sample information is lost; in particular, when m = 2 only the extent of a linear interval is retained, so the distribution of the original samples is lost entirely and the cost of repeated computation is high; (3) the neighborhood relations of the resampled points do not correspond to the neighborhood relations of the samples in the PS space. We therefore propose a new LLE-based modeling algorithm, called the O2O-LLE approach. It is inspired by the LLE idea and makes full use of the mapping already known in an MOP, in which the decision space is treated as the high-dimensional space and the objective space as the low-dimensional space. Thus, instead of building an overall low-dimensional embedding of the samples and then mapping it back to the high-dimensional space, the new modeling algorithm constructs new samples directly in the high-dimensional space, so the above weaknesses are avoided. Its steps are as follows: (1) map the samples from the PS space to the PF space; (2) search for the K nearest neighbors in the PF space; (3) compute the LLE reconstruction coefficients from the K neighbors in the PF space; (4) generate offspring in the PS space using those reconstruction coefficients. In addition, unlike the earlier framework HMOEDA_LLE, the new framework O2O-LLE-RM contains no genetic operators, which improves its efficiency. To verify the performance of O2O-LLE-RM, several widely used test problems are employed in comparison experiments with three state-of-the-art multi-objective evolutionary algorithms: NSGA-II, RM-MEDA and the multi-objective firefly algorithm. The simulation results show that the proposed algorithm achieves better optimization performance.
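
The four O2O-LLE steps in the abstract can be illustrated with a short sketch. The Python code below is a minimal, hypothetical illustration of that idea only, not the authors' implementation: the function names, the neighborhood size k, the regularization constant and the Gaussian perturbation added to each child are assumptions, and the reconstruction coefficients are computed with the standard constrained least-squares formulation of LLE [12].

```python
import numpy as np

def lle_weights(y, neighbors_y, reg=1e-3):
    """Standard LLE reconstruction weights for one point in PF (objective) space.

    Solves  min ||y - sum_j w_j * neighbors_y[j]||^2  subject to  sum_j w_j = 1
    via the regularized local Gram matrix.
    """
    K = len(neighbors_y)
    Z = neighbors_y - y                           # center neighbors on y, shape (K, m)
    G = Z @ Z.T                                   # local Gram matrix, shape (K, K)
    G += reg * (np.trace(G) + 1e-12) * np.eye(K)  # regularize for numerical stability
    w = np.linalg.solve(G, np.ones(K))
    return w / w.sum()                            # enforce the sum-to-one constraint

def o2o_lle_offspring(X, F, k=5, noise=0.01, rng=None):
    """Produce one offspring per individual following the abstract's four steps.

    X: (N, n) decision vectors (PS space);  F: (N, m) objective vectors (PF space).
    """
    rng = np.random.default_rng() if rng is None else rng
    offspring = np.empty_like(X)
    for i in range(len(X)):
        # Step 1 is assumed done: F already holds the objective values of X.
        # Step 2: K nearest neighbors of individual i, searched in PF space.
        dist = np.linalg.norm(F - F[i], axis=1)
        nbr = np.argsort(dist)[1:k + 1]           # skip the point itself
        # Step 3: LLE reconstruction coefficients computed from the PF neighbors.
        w = lle_weights(F[i], F[nbr])
        # Step 4: reuse those coefficients on the decision vectors (PS space),
        # plus a small perturbation so the child is not purely interpolative.
        offspring[i] = w @ X[nbr] + noise * rng.standard_normal(X.shape[1])
    return offspring
```

Reusing the objective-space coefficients directly on the decision vectors is the point of step (4): no global low-dimensional embedding is built and mapped back, which is how the sketch mirrors the stated departure from the earlier LLE-based modeling.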

References

  1. Srinivas N, Deb K (1994) Multi-objective optimization using non-dominated sorting in genetic algorithms. Evol Comput 2:221–248

  2. Murata T, Ishibuchi H (1995) MOGA: multi-objective genetic algorithms. In: Proceedings of the 2nd IEEE international conference on evolutionary computing. IEEE, vol 2, pp 289–294

  3. Zitzler E, Thiele L (1999) Multi-objective evolutionary algorithms: a comparative case study and the strength Pareto approach. IEEE Trans Evol Comput 3:257–271

  4. Deb K, Pratap A, Agarwal S, Meyarivan T (2002) A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans Evol Comput 6:182–197

  5. Zitzler E, Laumanns M, Thiele L (2001) SPEA2: improving the strength Pareto evolutionary algorithm. Technical report 103. Computer Engineering and Networks Laboratory (TIK), Swiss Federal Institute of Technology (ETH) Zurich, Gloriastrasse 35, CH-8092 Zurich, Switzerland

  6. Zhou A, Zhang Q, Jin Y, Tsang E, Okabe T (2005) A model-based evolutionary algorithm for bi-objective optimization. In: Congress on evolutionary computation. IEEE, pp 2568–2575

  7. Zhou A, Jin Y, Zhang Q, Sendhoff B, Tsang E (2006) Combining model-based and genetics-based offspring generation for multi-objective optimization using a convergence criterion. In: Congress on evolutionary computation. IEEE, pp 3234–3241

  8. Zhou A, Zhang Q, Jin Y, Sendhoff B, Tsang E (2006) Modelling the population distribution in multiobjective optimization by generative topographic mapping. In: Parallel problem solving from nature-PPSN IX. Springer, Berlin, pp 443–452

  9. Zhang Q, Zhou A, Jin Y (2008) RM-MEDA: a regularity model-based multiobjective estimation of distribution algorithm. IEEE Trans Evol Comput 12:41–63

  10. Zhou A, Zhang Q, Jin Y (2009) Approximating the set of Pareto-optimal solutions in both the decision and objective spaces by an estimation of distribution algorithm. IEEE Trans Evol Comput 13:1167–1189

  11. Yuan Q, Dai G (2015) Research on the features of NSGA-II and RM-MEDA. J Comput Inf Syst 11:6735–6745

  12. Roweis ST, Saul LK (2000) Nonlinear dimensionality reduction by locally linear embedding. Science 290:2323–2326

  13. Chen J, Liu Y (2011) Locally linear embedding: a survey. Artif Intell Rev 36:29–48

  14. Chahooki MAZ, Charkari NM (2014) Shape classification by manifold learning in multiple observation spaces. Inf Sci 262:46–61

  15. Chahooki MAZ, Charkari NM (2014) Unsupervised manifold learning based on multiple feature spaces. Mach Vis Appl 25:1053–1065

  16. Hettiarachchi R, Peters JF (2015) Multi-manifold LLE learning in pattern recognition. Pattern Recognit 48:2947–2960

  17. Zhang Y, Dai G et al (2014) HMOEDA_LLE: a hybrid multi-objective estimation of distribution algorithm combining locally linear embedding. In: 2014 IEEE congress on evolutionary computation (CEC). IEEE, pp 707–714

  18. Yang XS (2013) Multiobjective firefly algorithm for continuous optimization. Eng Comput 29:175–184

  19. Ding S, Wu F, Nie R, Yu J, Huang H (2013) Twin support vector machines based on quantum particle swarm optimization. J Softw 8:1743–1750

  20. Ding S, Yu J, Huang H, Zhao H (2013) Twin support vector machines based on particle swarm optimization. J Comput 8:2296–2303

  21. Ding S, Zhang Y, Chen J, Jia W (2013) Research on using genetic algorithms to optimize Elman neural networks. Neural Comput Appl 23:293–297

  22. Ding S, Zhang X, Yu J (2015) Twin support vector machines based on fruit fly optimization algorithm. Int J Mach Learn Cybern. doi:10.1007/s13042-015-0424-8

  23. Yang X-S, Deb S (2013) Multiobjective cuckoo search for design optimization. Comput Oper Res 40:1616–1624

  24. Fister I, Yang X-S, Brest J (2013) Modified firefly algorithm using quaternion representation. Expert Syst Appl 40:7220–7230

  25. Yang X-S (2014) Swarm intelligence based algorithms: a critical analysis. Evol Intell 7:17–28

  26. Gandomi AH, Yang X-S (2014) Chaotic bat algorithm. J Comput Sci 5:224–232

  27. Cayton L (2005) Algorithms for manifold learning. University of California, San Diego. Tech. Rep. CS2008-0923

  28. Seung HS, Lee DD (2000) The manifold ways of perception. Science 290:2268–2269

  29. Tenenbaum JB, De Silva V, Langford JC (2000) A global geometric framework for nonlinear dimensionality reduction. Science 290:2319–2323

  30. Belkin M, Niyogi P (2003) Laplacian eigenmaps for dimensionality reduction and data representation. Neural Comput 15:1373–1396

  31. Donoho DL, Grimes C (2003) Hessian eigenmaps: locally linear embedding techniques for high-dimensional data. Proc Natl Acad Sci 100:5591–5596

  32. Zhang Z, Zha H (2005) Principal manifolds and nonlinear dimensionality reduction via tangent space alignment. SIAM J Sci Comput 26:313–338

  33. Laumanns M, Ocenasek J (2002) Bayesian optimization algorithms for multi-objective optimization. Parallel problem solving from nature PPSN VII. Springer, Berlin, pp 298–307

  34. Okabe T, Jin Y, Sendoff B et al (2004) Voronoi-based estimation of distribution algorithm for multi-objective optimization. In: Congress on evolutionary computation, 2004. CEC2004. IEEE, vol 2. pp 1594–1601

  35. Bosman PAN, Thierens D (2005) The naive MIDEA: a baseline multi-objective EA. In: Evolutionary multi-criterion optimization: Third international conference, EMO 2005, Guanajuato, Mexico, March 9–11, 2005, Proceedings, vol 3410. Springer, p 428

  36. Pelikan M, Sastry K, Goldberg DE (2005) Multiobjective hBOA, clustering, and scalability. In: Proceedings of the 2005 conference on genetic and evolutionary computation. ACM, pp 663–670

  37. Dong W, Yao X (2008) Unified eigen analysis on multivariate Gaussian based estimation of distribution algorithms. Inf Sci 178:3000–3023

  38. Zhang D, Gong X, Dai G (2011) Multi-objective evolutionary algorithm for principal curve model based on multifractal. J Comput Res Dev 48:1729–1739

  39. Farhang-Mehr A, Azarm S (2002) Diversity assessment of Pareto optimal solution sets: an entropy approach. In: Proceedings of the 2002 congress on evolutionary computation, 2002. CEC’02. IEEE, vol. 1. pp 723–728

  40. Gunawan S, Farhang-Mehr A, Azarm S (2003) Multi-level multi-objective genetic algorithm using entropy to preserve diversity. In: Fonseca CM, Fleming PJ, Zitzler E, Thiele L, Deb K (eds) Evolutionary multi-criterion optimization. Springer, Berlin, pp 148–161

  41. Ocenasek J (2006) Entropy-based convergence measurement in discrete estimation of distribution algorithms. Towards a new evolutionary computation. Springer, Berlin, pp 39–50

  42. Wang YN, Wu LH, Yuan XF (2010) Multi-objective self-adaptive differential evolution with elitist archive and crowding entropy-based diversity measure. Soft Comput 14:193–209

  43. Qin Y, Ji J, Liu C (2012) An entropy-based multiobjective evolutionary algorithm with an enhanced elite mechanism. Appl Comput Intell Soft Comput 2012:91–96

  44. Wang LL, Chen Y (2012) Diversity based on entropy: a novel evaluation criterion in multi-objective optimization algorithm. Int J Intell Syst Appl (IJISA) 4:113

  45. Li H, Zhang Q (2009) Multiobjective optimization problems with complicated Pareto sets, MOEA/D and NSGA-II. IEEE Trans Evol Comput 13:284–302

  46. Bosman PAN, Thierens D (2003) The balance between proximity and diversity in multiobjective evolutionary algorithms. IEEE Trans Evol Comput 7:174–188

  47. Sheskin DJ (2006) Handbook of parametric and nonparametric statistical procedures, 4th edn. Chapman & Hall/CRC, London

  48. Zhao H, Li M, Weng X, Zhou H (2015) Performance evaluation for biology-inspired optimization algorithms based on nonparametric statistics. J Air Force Eng Univ (Nat Sci Edn) 1:89–94

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant Nos. 61472375, 61103144, 61379101).

Author information

Corresponding author

Correspondence to Guangming Dai.

About this article

Cite this article

Yuan, Q., Dai, G. & Zhang, Y. A novel multi-objective evolutionary algorithm based on LLE manifold learning. Engineering with Computers 33, 293–305 (2017). https://doi.org/10.1007/s00366-016-0473-y

  • Received:

  • Accepted:

  • Published:

  • Issue Date:

  • DOI: https://doi.org/10.1007/s00366-016-0473-y
