
Information Sciences

Volume 177, Issue 2, 15 January 2007, Pages 632-654

A real-coding jumping gene genetic algorithm (RJGGA) for multiobjective optimization

https://doi.org/10.1016/j.ins.2006.07.019

Abstract

This paper presents a real jumping gene genetic algorithm (RJGGA) as an enhancement of the jumping gene genetic algorithm (JGGA) [T.M. Chan, K.F. Man, K.S. Tang, S. Kwong, A jumping gene algorithm for multiobjective resource management in wideband CDMA systems, The Computer Journal 48 (6) (2005) 749–768; T.M. Chan, K.F. Man, K.S. Tang, S. Kwong, Multiobjective optimization of radio-to-fiber repeater placement using a jumping gene algorithm, in: Proceedings of the IEEE International Conference on Industrial Technology (ICIT 2005), Hong Kong, 2005, pp. 291–296; K.F. Man, T.M. Chan, K.S. Tang, S. Kwong, Jumping-genes in evolutionary computing, in: Proceedings of the IEEE IECON’2004, Busan, 2004, pp. 1268–1272]. JGGA is a relatively new multiobjective evolutionary algorithm (MOEA) that imitates the jumping gene phenomenon discovered by Nobel Laureate McClintock during her work on corn plants. The main feature of JGGA is a simple operation in which a transposition of gene(s) is induced within the same or another chromosome in the genetic algorithm (GA) framework. In its initial formulation, the search space solutions are binary-coded, so JGGA inherits the customary problems of conventional binary-coded GAs (BCGAs). This issue motivated us to remodel JGGA into RJGGA. The performance of RJGGA has been compared to that of other MOEAs using carefully chosen benchmark test functions. We observe that RJGGA is able to generate non-dominated solutions with a wider spread along the Pareto-optimal front and better addresses the issues of convergence and diversity in multiobjective optimization.

Introduction

In the world around us, few problems involve only a single value or objective. Instead, most problems present multiple objectives that often conflict with each other. These objectives have to be met or optimized before any adequate solution is reached. Problems having two or more objective functions are called “multiobjective” problems, and it is rarely the case that a single solution can simultaneously satisfy all the objectives. Usually, a trade-off among such conflicting objectives is unavoidable. In fact, the notion of “optimality” changes when dealing with multiobjective optimization problems. This notion was originally introduced by Edgeworth [18] and later generalized by Pareto [44]; it is called an Edgeworth–Pareto optimum or, simply, a Pareto-optimum.
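The Pareto notion can be made concrete with a small dominance check. The sketch below is a generic illustration (not code from the paper), assuming minimization of all objectives:

```python
def dominates(a, b):
    """Return True if objective vector `a` Pareto-dominates `b`
    (minimization assumed): `a` is no worse in every objective
    and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b)) and
            any(x < y for x, y in zip(a, b)))

print(dominates((1.0, 5.0), (2.0, 6.0)))  # True: better in both objectives
print(dominates((1.0, 5.0), (2.0, 4.0)))  # False: a trade-off, neither dominates
```

Two mutually non-dominated vectors such as `(1.0, 5.0)` and `(2.0, 4.0)` are exactly the trade-off solutions that make up a Pareto-optimal set.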

Evolutionary multiobjective optimization has attracted significant attention from researchers in various fields due to its effectiveness and robustness in searching for a set of trade-off solutions. In addition, its real-world application has become increasingly popular in the last few years [5], [7], [23], [31], [32], [36], [40], [54], [55]. Unlike conventional methods that aggregate multiple attributes into a composite scalar objective function, MOEAs are capable of considering each objective component separately and guiding the search toward the global Pareto-optimal front. The main challenges in a multiobjective optimization environment are to minimize the distance of the generated solutions to the Pareto set, and to maintain the diversity of the developed Pareto set. An MOEA is considered “good” only if both goals are satisfied simultaneously. The potential of evolutionary algorithms to solve multiobjective optimization problems was hinted at in the late 1960s by Rosenberg [46]. The first MOEA implementation dates to the mid-1980s with VEGA [48]. Since then, a considerable amount of research has been done in this area; comprehensive overviews and discussions can be found in several recent surveys [5], [6], [9], [55]. Most of these algorithms are motivated by Goldberg’s suggestion of a non-dominated sorting GA [24]. A representative collection includes HLGA [27], MOGA [21], [22], NPGA [34], SPEA [65], PAES [38], [39], NSGAII [16], etc.

In this paper, a new evolutionary algorithm, the real jumping gene genetic algorithm (RJGGA), is proposed as an improvement on JGGA. JGGA incorporates the jumping gene phenomenon discovered by Nobel Laureate McClintock in her work on corn plants [20]. The main feature of JGGA is a simple operation in which a transposition of gene(s) is induced within the same or another chromosome under the GA framework. The distinctive feature of this scheme is that it enables gene mobility within the same chromosome, or even to neighboring chromosome(s), in the search for appropriate non-dominated solutions to multiobjective optimization problems. In conventional evolutionary computing, a chromosome is weighted by the fitness function, and it can evolve from generation to generation as the fitness function is evaluated. With a single fitness function, no individual gene in the chromosome is excited consistently during the entire evolutionary process. As a result, premature convergence often occurs because the chromosome or its genes have not experienced stress severe enough to make the genes jump to another, more stable position.

The power and success of GAs are mostly achieved through the diversity of the individuals in a population evolving under the principle of “survival of the fittest” [51]. This is accomplished by the classical genetic crossover and mutation operators. In fact, the genetic diversity of individuals can also be achieved by other mechanisms, such as gene insertion, duplication or movement. In this respect, Mitchell and Forrest pointed out the importance of studying other mechanisms for rearranging genetic material (e.g., jumping genes, gene deletion and duplication, introns and exons) to understand whether any are algorithmically significant [43].

JGGA [3], [4], [41] is an adaptation of NSGAII [16], an elitist multiobjective genetic algorithm. NSGAII has several advantages over other available MOEAs; these have been reviewed by Deb [9]. NSGAII uses the concept of elitism, which unfortunately reduces the diversity of the population to some extent. In the 1940s, McClintock discovered, in work that later earned her the Nobel Prize, that segments of DNA, called jumping genes (or transposons), could jump in and out of chromosomes [42]. It was later recognized that jumping genes could provide additional diversity. Inspired by this phenomenon, we implemented JGGA, integrating a jumping genes operator within the framework of NSGAII. JGGA was found to overcome the loss of diversity caused by elitism in NSGAII. The jumping genes paradigm is, therefore, an appropriate mechanism for handling multiple objective functions. Each of these functions may evaluate a certain part of a system and thereby indirectly induce stress on the chromosome, which may cause gene movements around the chromosome. Consequently, genes have more chances to jump to a new position, which ultimately yields a higher probability of escaping a pseudo-equilibrium point.
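The transposition idea can be sketched in a few lines. The following is a hypothetical illustration of a cut-and-paste jump between two list-encoded chromosomes, not the paper’s implementation (the published operator also has a copy-and-paste variant and randomizes the transposon length, which is fixed here):

```python
import random

def cut_and_paste(donor, receiver, length=2, rng=random):
    """Illustrative cut-and-paste transposition: a short gene segment
    (the transposon) is excised from `donor` and inserted at a random
    position in `receiver`. Chromosomes are plain Python lists."""
    donor, receiver = donor[:], receiver[:]          # work on copies
    start = rng.randrange(len(donor) - length + 1)   # where to cut
    transposon = donor[start:start + length]
    del donor[start:start + length]                  # excise the segment
    pos = rng.randrange(len(receiver) + 1)           # where to paste
    receiver[pos:pos] = transposon
    return donor, receiver
```

Applied within one chromosome (donor and receiver the same), the same idea moves genes to a new locus; applied across chromosomes, it exchanges genetic material beyond what crossover alone produces.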

Traditional GAs employing binary-string encoding are not suitable for many continuous optimization problems [1], [28], [30]. Real coding is more suitable for these types of problems: the real-coding approach is adequate when tackling optimization problems whose parameters are variables in continuous domains [13], [28]. The original JGGA was implemented with binary coding and was no exception among BCGAs in its handling of continuous search problems. Handling a continuous search space with BCGAs has the following drawbacks: the inability to achieve arbitrary precision in the solution obtained, the fixed mapping of problem variables, the Hamming cliff problem associated with binary coding, and the difficulty of processing Holland’s schemata in a continuous search space. A real-coded GA (RCGA) makes the representation of the solution closer to the formulation of the problem and avoids these drawbacks [57]. These considerations motivated us to remodel the existing JGGA into a real-coded system. To introduce and verify RJGGA, this paper presents the basic concept of the jumping gene phenomenon and the modifications required to transform JGGA into RJGGA.
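The Hamming cliff drawback mentioned above is easy to demonstrate: two adjacent real values can have binary codes that differ in every bit, so a tiny move in the decision space may require many simultaneous bit flips, which mutation rarely achieves. A small generic illustration (not from the paper):

```python
def hamming(a, b, bits=4):
    """Number of differing bits between the `bits`-bit
    binary codes of integers a and b."""
    return sum(1 for i in range(bits) if (a >> i) & 1 != (b >> i) & 1)

# Adjacent integers 7 (0111) and 8 (1000) differ in all four bits:
print(hamming(7, 8))  # 4
```

A single-bit mutation can never move a 4-bit individual from 7 to 8 in one step, even though the phenotypes are neighbors; real coding avoids this mismatch between genotype and phenotype distances.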

To assess the efficiency of RJGGA, we systematically compare it with various existing MOEAs using five carefully chosen test functions. Each test function involves a particular feature known to cause difficulty in an evolutionary optimization process. As there are several complementary goals in multiobjective optimization, it is hard to capture quality precisely with a single performance metric. In this paper, a total of five performance metrics are applied to compare our proposed RJGGA with other MOEAs. These metrics examine the convergence and diversity of the solutions obtained along the Pareto-optimal front, as well as the relative domination of one algorithm over another.

This paper is organized as follows. In Section 2, the fundamental principles of jumping genes and their computational operations are discussed. Comparisons of RCGAs with BCGAs are given in Section 3. In Section 4, the implementation of RJGGA is described in detail. The choice and justification of performance metrics and test functions are given in Section 5. Section 6 contains the experimental results and analysis. Finally, conclusions are presented in Section 7.

Section snippets

Jumping gene genetic algorithm (JGGA)

During the past few years, a number of evolutionary algorithms (EAs) have been suggested for solving multiobjective optimization problems. However, none of these optimization techniques is perfect [37]; hence, the search continues. Since Holland’s proposals back in 1975, two main genetic operators, crossover and mutation, have been explored with success. Nevertheless, in nature there exist many more mechanisms for genetic recombination, based on phenomena like gene insertion, duplication or movement. JGGA

Real-coding genetic algorithm

BCGAs use binary strings to code the problem variables when solving problems with continuous search spaces. When binary coding is used for a continuous search space, a number of difficulties arise, and it becomes problematic in real-life situations. The inability to achieve arbitrary precision in the optimal solution is one such problem. In BCGAs, the string length must be chosen in advance to enable the GA to achieve a certain precision in the solution. The more precision required,

Real jumping gene genetic algorithm

In RCGAs, a difficulty arises with the search operators. BCGAs code the decision variables as finite-length binary strings, so exchanging portions of two parent strings can be implemented and visualized easily, and simply flipping a bit to perform mutation is convenient and resembles a natural mutation event. In RCGAs, the main challenge is how to use a pair of real-parameter decision variable vectors to create a new pair of offspring vectors, or how to perturb a decision variable vector to a
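One widely used answer to this challenge is Deb and Agrawal’s simulated binary crossover (SBX), cited above as [13], which mimics the offspring distribution of single-point binary crossover directly on real variables. The sketch below is a minimal single-variable version for illustration, not the paper’s exact implementation (variable bounds and per-variable crossover probability are omitted):

```python
import random

def sbx_pair(p1, p2, eta=15.0, rng=random):
    """Simulated binary crossover (SBX) on one real-valued variable.
    `eta` is the distribution index: larger values keep offspring
    closer to their parents."""
    u = rng.random()
    # Spread factor beta drawn from the SBX polynomial distribution.
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (eta + 1.0))
    else:
        beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
    c1 = 0.5 * ((1.0 + beta) * p1 + (1.0 - beta) * p2)
    c2 = 0.5 * ((1.0 - beta) * p1 + (1.0 + beta) * p2)
    return c1, c2
```

By construction the two offspring are symmetric about the parents’ midpoint (c1 + c2 = p1 + p2), mirroring the mean-preserving property of single-point crossover on binary strings.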

Performance metrics and test problems

It is very important to choose appropriate metrics to validate an EA. However, when dealing with multiobjective optimization problems, there are several reasons why assessing the quality of results becomes difficult [5]. The first problem is that a multiobjective optimizer generates several solutions instead of one. Moreover, the stochastic nature of EAs makes it necessary to perform several runs to assess performance. Thus, the results have to be validated using statistical analysis tools.
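As a concrete example of the convergence side of such an assessment, one common metric of this kind averages the distance from each obtained solution to a reference Pareto-optimal front. The sketch below is a generic illustration of that idea, not necessarily one of the five metrics used in this paper:

```python
import math

def generational_distance(front, pareto_ref):
    """Average Euclidean distance from each point of an obtained
    `front` to its nearest point on a sampled reference
    Pareto-optimal front; 0 means the front lies on the reference."""
    total = 0.0
    for p in front:
        total += min(math.dist(p, q) for q in pareto_ref)
    return total / len(front)
```

A diversity metric would complement this by measuring how evenly the obtained points spread along the front, since convergence and diversity must be judged together.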

Since the goal

Performance assessments and comparisons

This section presents the comparison of RJGGA with other existing well-known MOEAs based on the five benchmark test functions using the five metrics described above. The MOEAs included in the comparison are: NSGAII [16], NPGA [34], MOGA [21], [22], SPEA [65], VEGA [47], HLGA [27] and PAES [38], [39].

For all the algorithms, the population size, crossover rate, mutation rate and maximum number of generations are 100, 0.9, 1/n or 1/l (where n is the number of decision variables for real-coded GAs

Conclusions

In this paper, we have proposed RJGGA as a real-parameter improvement on JGGA. The most important feature of RJGGA is its capability to exploit local search heuristics, which enables gene(s) to jump from one position to another, either within its own chromosome or to another chromosome(s), under multiple stresses. The jumping gene transposition is an operator, additional to the existing genetic operators, that introduces genetic variation into the population. Indeed, the jumping gene operators

Acknowledgements

The authors wish to thank Prof. Brian Ralph of Brunel University for his feedback. His comments helped us to substantially improve the quality of this paper and to make it more readable. We also gratefully acknowledge the support from City University Strategic Grant 7001955.

References (67)

  • T.M. Chan et al.

    A jumping gene algorithm for multiobjective resource management in wideband CDMA systems

    The Computer Journal

    (2005)
  • T.M. Chan, K.F. Man, K.S. Tang, S. Kwong, Multiobjective optimization of radio-to-fiber repeater placement using a...
  • C.A. Coello Coello

    Recent trends in evolutionary multiobjective optimization

  • C.A. Coello Coello et al.

    Evolutionary Algorithms for Solving Multi-Objective Problems

    (2002)
  • C.A. Coello Coello et al.

    Applications of Multi-Objective Evolutionary Algorithms

    (2004)
  • L.H. Caporale

    Jumping genes

  • K. Deb

    Multiobjective Optimization using Evolutionary Algorithms

    (2001)
  • K. Deb

    Multi-objective genetic algorithms: problem difficulties and construction of test functions

    Evolutionary Computation

    (1999)
  • K. Deb, Construction of test problems for multi-objective optimization, in: Proceedings of the Genetic and Evolutionary...
  • K. Deb, Multi-objective genetic algorithms: problem difficulties and construction of test problems, Technical Report...
  • K. Deb et al.

    Self-adaptive genetic algorithms with simulated binary crossover

    Evolutionary Computation

    (2001)
  • K. Deb et al.

    Simulated binary crossover for continuous search space

    Complex Systems

    (1995)
  • K. Deb et al.

    A combined genetic adaptive search (GeneAS) for engineering design

    Computer Science and Informatics

    (1996)
  • K. Deb et al.

    A fast and elitist multiobjective genetic algorithm: NSGA-II

    IEEE Transactions on Evolutionary Computation

    (2002)
  • K. Deb et al.

    Scalable test problems for evolutionary multiobjective optimization

  • F.Y. Edgeworth

    Mathematical Psychics

    (1881)
  • S. Esquivel, H. Leiva, R. Gallard, Multiple crossovers between multiple parents to improve search in evolutionary...
  • C.M. Fonseca, P.J. Fleming, Multi-objective genetic algorithm made easy: selection, sharing and mating restriction, in:...
  • C.M. Fonseca et al.

    Genetic algorithms for multiobjective optimization: formulation, discussion and generalization

  • D.E. Goldberg

    Genetic Algorithms in Search, Optimization, and Machine Learning

    (1989)
  • D.E. Goldberg et al.

    Genetic algorithms, noise, and the sizing of populations

    Complex Systems

    (1992)
  • C. Guria et al.

    Multi-objective optimal synthesis and design of froth flotation circuits for mineral processing using the jumping gene adaptation of genetic algorithm

    Industrial and Engineering Chemistry Research

    (2005)