Abstract
This paper proposes a new approach, SGMIEC, in the field of estimation of distribution algorithms (EDAs). Because current EDAs spend considerable time in the statistical learning process when the relationships among the variables are complicated, this approach deploys the selfish gene theory (SG) and builds a mutual information and entropy based cluster (MIEC) model with an incremental learning and resampling scheme to optimize the probability distribution of the virtual population. Experimental results on several benchmark problems demonstrate that, compared with BMDA, COMIT and MIMIC, SGMIEC often performs better in convergence reliability, convergence speed and the convergence process.
References
Ahn CW, Ramakrishna RS (2008) On the scalability of real-coded Bayesian optimization algorithm. IEEE Trans Evol Comput 12(3):307–322
Baluja S (1994) Population-based incremental learning: a method for integrating genetic search based function optimization and competitive learning. Technical Report CMU-CS-94-163, Carnegie Mellon University, Pittsburgh
Baluja S, Davies S (1997) Using optimal dependency-trees for combinatorial optimization. In: ICML ’97: proceedings of the fourteenth international conference on machine learning, San Francisco, pp 30–38
Baluja S, Davies S (1998) Fast probabilistic modeling for combinatorial optimization. In: Proceedings of 15th national conference on artificial intelligence (AAAI), pp 469–476
Bonet J, Isbell CL, Viola P (1997) MIMIC: finding optima by estimating probability densities. In: Advances in neural information processing systems, vol 9. MIT Press, Cambridge, pp 424–430
Corno F, Reorda MS, Squillero G (1998a) A new evolutionary algorithm inspired by the selfish gene theory. In: ICEC ’98: IEEE international conference on evolutionary computation, pp 575–580
Corno F, Reorda MS, Squillero G (1998b) The selfish gene algorithm: a new evolutionary optimization strategy. In: SAC ’98: 13th annual ACM symposium on applied computing, Atlanta, pp 349–355
Cover TM, Thomas JA (2006) Elements of information theory, 2nd edn. Wiley series in telecommunications and signal processing. Wiley, New York
Dawkins R (1989) The selfish gene—new edition. Oxford University Press, Oxford
Harik G (1999) Linkage learning via probabilistic modeling in the ECGA. Technical report, University of Illinois at Urbana-Champaign
Harik GR, Lobo FG, Goldberg DE (1999) The compact genetic algorithm. IEEE Trans Evol Comput 3(4):287–297
Harik GR, Lobo FG, Sastry K (2006) Linkage learning via probabilistic modeling in the extended compact genetic algorithm (ECGA). In: Scalable optimization via probabilistic modeling, pp 39–61
Hong Y, Kwong S, Wang H, Xie ZH, Ren Q (2008) SVPCGA: selection on virtual population based compact genetic algorithm. In: IEEE congress on evolutionary computation, pp 265–272
Larranaga P, Lozano J (2002) Estimation of distribution algorithms: a new tool for evolutionary computation. Kluwer, Boston
Muhlenbein H, Paass G (1996) From recombination of genes to the estimation of distributions I. Binary parameters. In: PPSN IV: proceedings of the 4th international conference on parallel problem solving from nature, London, pp 178–187
Pelikan M, Muhlenbein H (1998) Marginal distributions in evolutionary algorithms. In: Proceedings of the international conference on genetic algorithms Mendel ’98, pp 90–95
Pelikan M, Muhlenbein H (1999) The bivariate marginal distribution algorithm. In: Advances in soft computing: engineering design and manufacturing. Springer, London, pp 521–535
Pelikan M, Goldberg DE, Cantu-Paz E (1999) BOA: the Bayesian optimization algorithm. In: Proceedings of the genetic and evolutionary computation conference (GECCO-99). Morgan Kaufmann, pp 525–532
Yang S, Yao X (2008) Population-based incremental learning with associative memory for dynamic environments. IEEE Trans Evol Comput 12(5):542–561
Yang SY, Ho SL, Ni GZ, Machado JM, Wong KF (2007) A new implementation of population based incremental learning method for optimizations in electromagnetics. IEEE Trans Mag 43(4):1601–1604
Yu T-L, Goldberg DE (2004) Dependency structure matrix analysis: offline utility of the dependency structure matrix genetic algorithm. In: GECCO (2), pp 355–366
Yu T-L, Sastry K, Goldberg DE, Pelikan M (2007) Population sizing for entropy-based model building in discrete estimation of distribution algorithms. In: GECCO, pp 601–608
Zhang Q, Zhou A, Jin Y (2008) RM-MEDA: a regularity model-based multiobjective estimation of distribution algorithm. IEEE Trans Evol Comput 12(1):41–63
Zhou SD, Sun ZQ (2007) A survey on estimation of distribution algorithms. Acta Autom Sin 33(2):113–124
Zhou A, Zhang Q, Jin Y, Sendhoff B (2008) Combination of EDA and DE for continuous biobjective optimization. In: IEEE congress on evolutionary computation, pp 1447–1454
Acknowledgments
This work was supported by the Research Project of Wuhan University under Grant 6082018.
Appendix
The first benchmark problem is the One-Max: \( f(X) = \sum\nolimits_{i = 1}^{N} x_i \), where \( x_i \in \{0, 1\} \); the global optimum is the all-ones string.
The second benchmark problem is the Weighted One-Max: \( f(X) = \sum\nolimits_{i = 1}^{N} w_i x_i \), where the \( w_i \) are fixed positive weights.
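Both One-Max variants can be sketched in a few lines of Python (the function names and the example weights are ours, chosen only for illustration):

```python
def one_max(bits):
    """One-Max: count of 1-bits; the maximum N is reached at the all-ones string."""
    return sum(bits)

def weighted_one_max(bits, weights):
    """Weighted One-Max: weighted sum of 1-bits under fixed positive weights."""
    return sum(w * b for w, b in zip(weights, bits))
```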
The third benchmark problem is the four peaks:
Given an N-dimensional input vector X, the four peaks evaluation function is defined as \( f(X, T) = \max(o(X), z(X)) + R(X, T), \) where \( o(X) \) is the number of contiguous ones at the beginning of X, \( z(X) \) is the number of contiguous zeros at the end of X, and the reward \( R(X, T) \) equals N if \( o(X) > T \) and \( z(X) > T \), and 0 otherwise.
In all trials, T was set to 10% of N, the size of the problem.
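A minimal Python sketch of the four peaks function, following the standard definition by Baluja and Davies (the function name is ours):

```python
def four_peaks(bits, t):
    """Four peaks: max of leading ones o(X) and trailing zeros z(X),
    plus a bonus of N when both counts exceed the threshold T."""
    n = len(bits)
    # o(X): length of the run of ones at the head of the string
    o = 0
    for b in bits:
        if b != 1:
            break
        o += 1
    # z(X): length of the run of zeros at the tail of the string
    z = 0
    for b in reversed(bits):
        if b != 0:
            break
        z += 1
    reward = n if (o > t and z > t) else 0
    return max(o, z) + reward
```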
The fourth benchmark problem is the trap problem. The general k-bit trap function is defined as \( F_k(b_1, \ldots, b_k) = f_{high} \) if \( u = k \), and \( F_k(b_1, \ldots, b_k) = f_{low} - u \cdot f_{low}/(k - 1) \) otherwise, where \( b_i \in \{0, 1\} \), \( u = \sum\nolimits_{i = 1}^k b_i \) and \( f_{high} > f_{low} \). Usually, \( f_{high} \) is set at k and \( f_{low} \) is set at k − 1. The trap functions denoted by \( F_{m \times k} \) are defined as the sum of m disjoint k-bit traps: \( F_{m \times k}(X) = \sum\nolimits_{j = 1}^{m} F_k(b_{(j-1)k+1}, \ldots, b_{jk}) \).
The parameters m and k are varied to produce a number of test functions. In all trials, k was set to 5.
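The trap functions can be sketched in Python as follows, using the usual settings \( f_{high} = k \) and \( f_{low} = k - 1 \) (the function names are ours):

```python
def trap_k(block):
    """k-bit trap with f_high = k and f_low = k - 1: the all-ones block
    scores f_high, otherwise the score decreases as ones are added,
    which deceptively pulls search toward the all-zeros block."""
    k = len(block)
    f_high, f_low = k, k - 1
    u = sum(block)
    if u == k:
        return f_high
    return f_low - u * f_low / (k - 1)

def trap_m_by_k(bits, k=5):
    """F_{m x k}: sum of m disjoint k-bit traps over the bit string."""
    return sum(trap_k(bits[i:i + k]) for i in range(0, len(bits), k))
```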
The fifth benchmark problem is the satisfaction problem: \( f(X) = \sum\nolimits_{i = 1}^{N/5} f(x_{5i-4}, x_{5i-3}, x_{5i-2}, x_{5i-1}, x_{5i}), \) where \( f(x_{5i-4}, x_{5i-3}, x_{5i-2}, x_{5i-1}, x_{5i}) \) equals 5 if and only if all five variables equal 1, and 0 otherwise.
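A Python sketch of the satisfaction function over disjoint 5-bit blocks (the function name is ours):

```python
def satisfaction(bits):
    """Satisfaction: each disjoint 5-bit block contributes 5 iff
    all five of its variables are 1, and 0 otherwise."""
    total = 0
    for i in range(0, len(bits), 5):
        if all(b == 1 for b in bits[i:i + 5]):
            total += 5
    return total
```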
The sixth benchmark problem is the deceptive-3 problem: \( f(X) = \sum\nolimits_{i = 1}^{N/3} f_3(x_{3i-2}, x_{3i-1}, x_{3i}), \) where \( f_3 \) depends only on the number of ones \( u \) in its three-bit argument: \( f_3 = 0.9 \) for \( u = 0 \), \( f_3 = 0.8 \) for \( u = 1 \), \( f_3 = 0 \) for \( u = 2 \), and \( f_3 = 1 \) for \( u = 3 \).
This is a hard deceptive problem with a large number of locally optimal solutions.
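A Python sketch of deceptive-3 over disjoint 3-bit blocks. The subfunction values used below (0.9, 0.8, 0, 1 for u = 0, 1, 2, 3 ones) are the common parameterization of this benchmark and are assumed here; the function name is ours:

```python
# value of the 3-bit deceptive subfunction, indexed by the number of
# ones u in the block (assumed common parameterization)
DECEPTIVE3 = {0: 0.9, 1: 0.8, 2: 0.0, 3: 1.0}

def deceptive_3(bits):
    """Deceptive-3: sum of 3-bit deceptive subfunctions over disjoint
    blocks; only the all-ones block attains the block optimum 1.0,
    while low-order statistics point toward all zeros."""
    total = 0.0
    for i in range(0, len(bits), 3):
        u = sum(bits[i:i + 3])
        total += DECEPTIVE3[u]
    return total
```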
Cite this article
Wang, F., Lin, Z., Yang, C. et al. Using selfish gene theory to construct mutual information and entropy based clusters for bivariate optimizations. Soft Comput 15, 907–915 (2011). https://doi.org/10.1007/s00500-010-0557-3