A real-coding jumping gene genetic algorithm (RJGGA) for multiobjective optimization
Introduction
In the world around us, few problems are concerned with a single value or objective. Instead, most problems present multiple objectives that often conflict with each other, and all of them have to be met or optimized before an adequate solution is reached. Problems having two or more objective functions are called “multiobjective” problems, and it is rarely the case that a single solution can simultaneously satisfy all existing objectives; a trade-off among the conflicting objectives is usually unavoidable. In fact, the very notion of “optimality” changes when dealing with multiobjective optimization problems. This notion was originally introduced by Edgeworth [18] and later generalized by Pareto [44]; it is called an Edgeworth–Pareto optimum or, simply, a Pareto optimum.
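Pareto dominance, the relation underlying this notion of optimality, can be stated in a few lines of code. The sketch below is illustrative only (the tuple representation of objective vectors is our own convention), assuming minimization of every objective:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse than b in every objective and strictly better in
    at least one of them."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# (1, 2) dominates (2, 2): equal in one objective, better in the other.
assert dominates((1, 2), (2, 2))
# (1, 3) and (2, 2) trade off against each other: neither dominates.
assert not dominates((1, 3), (2, 2)) and not dominates((2, 2), (1, 3))
```

The set of mutually non-dominated solutions is exactly the Pareto-optimal front that the algorithms discussed below try to approximate.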
Evolutionary multiobjective optimization has attracted significant attention from researchers in various fields because of its effectiveness and robustness in searching for a set of trade-off solutions, and its real-world applications have become increasingly popular in recent years [5], [7], [23], [31], [32], [36], [40], [54], [55]. Unlike conventional methods that aggregate multiple attributes into a composite scalar objective function, multiobjective evolutionary algorithms (MOEAs) are capable of considering each objective component separately and guiding the search toward the global Pareto-optimal front. The main challenges in a multiobjective optimization environment are to minimize the distance of the generated solutions to the Pareto set and to maintain the diversity of the developed Pareto set; an MOEA is considered “good” only if both goals are satisfied simultaneously. The potential of evolutionary algorithms to solve multiobjective optimization problems was hinted at in the late 1960s by Rosenberg [46]. The first MOEA implementation, VEGA [48], dates to the mid-1980s. Since then, a considerable amount of research has been done in this area; comprehensive overviews and discussions can be found in several recent surveys [5], [6], [9], [55]. Most of these algorithms are motivated by Goldberg’s suggestion of a non-dominated GA [24]. A representative collection includes HLGA [27], MOGA [21], [22], NPGA [34], SPEA [65], PAES [38], [39], NSGAII [16], etc.
In this paper, a new evolutionary algorithm, the real-coding jumping gene genetic algorithm (RJGGA), is proposed as an improvement on JGGA. JGGA incorporates the jumping gene phenomenon discovered by Nobel Laureate McClintock in her work on corn plants [20]. The main feature of JGGA is a simple operation in which a transposition of gene(s) is induced within the same or another chromosome under the GA framework. The niche of this scheme is that it enables gene mobility within the same chromosome, or even to a neighboring one, in the search for appropriate non-dominated solutions in multiobjective optimization problems. In conventional evolutionary computing, a chromosome is weighted by the fitness function and evolves from generation to generation as that function is evaluated. Under a single fitness function, no individual gene in the chromosome can be excited consistently during the entire evolutionary process. As a result, premature convergence often occurs because the chromosome or its genes have not been placed under enough stress to make the genes jump to another, more stable position.
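As an illustration only, and not the paper's exact operator, a cut-and-paste style transposition within a single chromosome can be sketched as follows (the segment length and random-choice policy are our own assumptions):

```python
import random

def cut_and_paste(chromosome, seg_len=2, rng=random):
    """Illustrative cut-and-paste transposition: a short gene segment is
    excised from the chromosome and reinserted at a random new locus.
    Gene content is preserved; only gene positions change."""
    genes = list(chromosome)
    start = rng.randrange(len(genes) - seg_len + 1)   # where the segment is cut
    segment = genes[start:start + seg_len]
    del genes[start:start + seg_len]
    insert_at = rng.randrange(len(genes) + 1)         # where it jumps to
    return genes[:insert_at] + segment + genes[insert_at:]

child = cut_and_paste([1, 2, 3, 4, 5, 6])
# The multiset of genes is unchanged; only their ordering may differ.
assert sorted(child) == [1, 2, 3, 4, 5, 6]
```

A copy-and-paste variant (leaving the segment in place and duplicating it elsewhere) and a two-chromosome form follow the same pattern.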
The power and success of GAs are mostly achieved through the diversity of the individuals in a population that evolves under the principle of “survival of the fittest” [51]. This is accomplished by using the classical genetic crossover and mutation operators. In fact, the genetic diversity of individuals can also be achieved by other mechanisms, such as gene insertion, duplication or movement. In this respect, Mitchell and Forrest pointed out the importance of studying other mechanisms for rearranging genetic material (e.g., jumping genes, gene deletion/duplication, introns and exons) to understand whether any are algorithmically significant [43].
JGGA [3], [4], [41] is an adaptation of NSGAII [16], an elitist multiobjective genetic algorithm. NSGAII has several advantages over other available MOEAs, which have been reviewed by Deb [9]. NSGAII uses the concept of elitism, which unfortunately reduces the diversity of the population to some extent. In the 1940s, McClintock showed in her Nobel-prize-winning work that segments of DNA, called jumping genes (or transposons), could jump in and out of chromosomes [42]. Jumping genes were later recognized as a potential source of additional diversity. Inspired by this phenomenon, we implemented JGGA, integrating the jumping genes operator within the framework of NSGAII. JGGA was found to overcome the loss of diversity caused by elitism in NSGAII. The jumping genes paradigm is, therefore, an appropriate means for handling multiple objective functions. Each of these functions may evaluate a certain part of a system, which indirectly induces stress on the chromosome and may cause gene movements around the chromosome. Consequently, this creates more chances for genes to jump to new positions and thus ultimately raises the probability of escaping a pseudo-equilibrium point.
Traditional GAs employing binary-string encoding (BCGAs) are not suitable for many continuous optimization problems [1], [28], [30]. Real-coding is more suitable for these types of problems: the real-coding approach is adequate when tackling optimization problems whose variables lie in continuous domains [13], [28]. The original JGGA was implemented with binary-coding and was no exception among BCGAs in dealing with continuous search problems. Handling a continuous search space with BCGAs has the following drawbacks: the inability to achieve arbitrary precision in the obtained solution, the fixed mapping of problem variables, the Hamming cliff problem associated with binary-coding, and the processing of Holland’s schemata in a continuous search space. A real-coded GA (RCGA) makes the representation of the solution closer to the formulation of the problem and avoids these drawbacks [57]. These considerations motivated us to remodel the existing JGGA into a real-coded system. In order to introduce the concept and verification of RJGGA, this paper provides the basic concept of the jumping gene phenomenon and the modifications required for transforming JGGA into RJGGA.
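The Hamming cliff and precision drawbacks can be made concrete with a small sketch (the 4-bit example below is ours, not taken from the paper). Adjacent decoded values can be maximally distant in bit space, and the attainable resolution is fixed once the string length is chosen:

```python
def hamming(a, b):
    """Number of differing bit positions between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

def resolution(lb, ub, bits):
    """Smallest representable step when the interval [lb, ub] is mapped
    uniformly onto a fixed-length bit string of the given length."""
    return (ub - lb) / (2 ** bits - 1)

# Hamming cliff: moving from 7 to 8 (one step in value) flips every bit.
b7, b8 = format(7, "04b"), format(8, "04b")   # '0111' vs '1000'
assert hamming(b7, b8) == 4

# Precision is locked in by the string length: 4 bits on [0, 1]
# cannot resolve anything finer than 1/15.
assert abs(resolution(0.0, 1.0, 4) - 1 / 15) < 1e-12
```

A mutation or crossover that must cross such a cliff has to flip many bits simultaneously, which is improbable under the usual bitwise operators; real-coded representations avoid the issue entirely.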
To demonstrate the efficiency of RJGGA, we systematically compare it with various existing MOEAs using five carefully chosen test functions. Each test function involves a particular feature that is known to cause difficulty in an evolutionary optimization process. As there are several complementary goals in multiobjective optimization, it is hard to capture solution quality precisely with a single performance metric. In this paper, a total of five performance metrics are applied to compare the proposed RJGGA with other MOEAs. These performance metrics examine the convergence and diversity of the solutions obtained along the Pareto-optimal front, as well as the relative domination of one algorithm over another.
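As an example of a convergence metric of this kind, generational distance measures how far an obtained front lies from a reference Pareto-optimal set. The sketch below uses the averaged-distance variant (other formulations use a root-mean form); it is illustrative and not necessarily one of the five metrics used in this paper:

```python
import math

def generational_distance(front, pareto_set):
    """Average Euclidean distance from each obtained solution to its nearest
    point on the reference Pareto-optimal set. A value of 0 means every
    obtained solution lies exactly on the reference front."""
    total = 0.0
    for p in front:
        total += min(math.dist(p, q) for q in pareto_set)
    return total / len(front)

ref = [(0.0, 1.0), (0.5, 0.5), (1.0, 0.0)]
assert generational_distance(ref, ref) == 0.0        # perfect convergence
assert generational_distance([(0.5, 1.0)], ref) == 0.5
```

Convergence metrics say nothing about spread, which is why diversity metrics and pairwise-domination comparisons are needed alongside them.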
This paper is organized as follows. In Section 2, the fundamental principles of jumping genes and their computational operations are discussed. Comparisons of RCGAs with BCGAs are given in Section 3. In Section 4, the implementation of RJGGA is described in detail. The choice and justification of performance metrics and test functions are given in Section 5. Section 6 contains the experimental results and analysis. Finally, conclusions are presented in Section 7.
Jumping gene genetic algorithm (JGGA)
During the past few years, a number of evolutionary algorithms (EAs) have been suggested for solving multiobjective optimization problems. However, none of these optimization techniques is perfect [37]; hence, the search continues. Since Holland’s proposal back in 1975, two main genetic operators, crossover and mutation, have been explored with success. Nevertheless, in nature there exist many more mechanisms for genetic recombination, based on phenomena like gene insertion, duplication or movement. JGGA
Real-coding genetic algorithm
BCGAs use binary-strings to code the problem variables when solving problems with a continuous search space. When binary-coding is used for a continuous search space, a number of difficulties arise that become problematic in real-life situations. The inability to achieve arbitrary precision in the optimal solution is one such problem. In BCGAs, the string length must be chosen in advance to enable the GA to achieve a certain precision in the solution. The more precision required,
Real jumping gene genetic algorithm
In RCGAs, a difficulty arises with search operators. BCGAs code the decision variables in finite-length binary-strings, and exchanging portions of the two parent strings can be implemented and visualized easily. Simply flipping a bit to perform mutation is convenient and it resembles a natural mutation event. In RCGAs, the main challenge is how to use a pair of real-parameter decision variable vectors to create a new pair of offspring vectors, or how to perturb a decision variable vector to a
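One widely used answer to this question is simulated binary crossover (SBX), cited in the references above. A minimal single-variable sketch follows, assuming the standard Deb–Agrawal formulation rather than the exact operator configuration used in this paper:

```python
import random

def sbx_pair(x1, x2, eta=15.0, rng=random):
    """Simulated binary crossover for one real-valued decision variable.
    Offspring are spread around the parents with a polynomial distribution
    whose shape is controlled by eta (larger eta keeps children closer
    to the parents)."""
    u = rng.random()
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (eta + 1.0))
    else:
        beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
    c1 = 0.5 * ((1 + beta) * x1 + (1 - beta) * x2)
    c2 = 0.5 * ((1 - beta) * x1 + (1 + beta) * x2)
    return c1, c2

c1, c2 = sbx_pair(1.0, 3.0)
# SBX is mean-preserving: the children's midpoint equals the parents'.
assert abs((c1 + c2) / 2 - 2.0) < 1e-9
```

Full real-coded variants apply this per variable, add a crossover probability per variable, and clip the children to the variable bounds.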
Performance metrics and test problems
It is very important to choose appropriate metrics to validate an EA. However, when dealing with multiobjective optimization problems, there are several reasons why a qualitative assessment of results becomes difficult [5]. The first problem is that a run generates several solutions instead of one. Furthermore, the stochastic nature of EAs makes it necessary to perform several runs to assess their performance. Thus, the results have to be validated using statistical analysis tools.
Since the goal
Performance assessments and comparisons
This section presents the comparison of RJGGA with other existing well-known MOEAs based on the five benchmark test functions using the five metrics described above. The MOEAs included in the comparison are: NSGAII [16], NPGA [34], MOGA [21], [22], SPEA [65], VEGA [47], HLGA [27] and PAES [38], [39].
For all the algorithms, the population size, crossover rate, mutation rate and maximum number of generations are 100, 0.9, 1/n or 1/l (where n is the number of decision variables for real-coded GAs
Conclusions
In this paper, we have proposed RJGGA as a real-parameter improvement on JGGA. The most important feature of RJGGA is its capability to exploit local search heuristics, which enables the gene(s) to jump from one position to another, either within its own chromosome or to another chromosome, under multiple stresses. The jumping gene transposition is an additional operator, alongside the existing genetic operators, that introduces genetic variation into the population. Indeed, the jumping gene operators
Acknowledgements
The authors wish to thank Prof. Brian Ralph of Brunel University for his feedback. His comments helped us to substantially improve the quality of this paper and to make it more readable. We also gratefully acknowledge the support from City University Strategic Grant 7001955.
References (67)
- Multi-objective rule mining using genetic algorithms, Information Sciences (2004)
- MRCD: a genetic algorithm for multiobjective robust control design, Engineering Applications of Artificial Intelligence (2002)
- Soft computing techniques for the design of mobile robot behaviors, Information Sciences (2000)
- Three-objective genetics-based machine learning for linguistic rule extraction, Information Sciences (2001)
- Multi-objective optimization of an industrial fluidized-bed catalytic cracking unit (FCCU) using genetic algorithm (GA) with the jumping genes operator, Computers and Chemical Engineering (2003)
- The use of multiple objective genetic algorithm in self-healing network, Applied Soft Computing (2002)
- DE/EDA: a new evolutionary algorithm for global optimization, Information Sciences (2005)
- Search space boundary extension method in real-coded genetic algorithms, Information Sciences (2001)
- Real-parameter genetic algorithms for finding multiple optimal solutions in multi-modal optimization
- J. Branke, Memory enhanced evolutionary algorithms for changing optimization problems, in: Proceedings of the Congress...
- A jumping gene algorithm for multiobjective resource management in wideband CDMA systems, The Computer Journal
- Recent trends in evolutionary multiobjective optimization
- Evolutionary Algorithms for Solving Multi-Objective Problems
- Applications of Multi-Objective Evolutionary Algorithms
- Jumping genes
- Multiobjective Optimization using Evolutionary Algorithms
- Multi-objective genetic algorithms: problem difficulties and construction of test functions, Evolutionary Computation
- Self-adaptive genetic algorithms with simulated binary crossover, Evolutionary Computation
- Simulated binary crossover for continuous search space, Complex Systems
- A combined genetic adaptive search (GeneAS) for engineering design, Computer Science and Informatics
- A fast and elitist multiobjective genetic algorithm: NSGA-II, IEEE Transactions on Evolutionary Computation
- Scalable test problems for evolutionary multiobjective optimization, Mathematical Physics
- Genetic algorithm for multiobjective optimization, formulation, discussion and generalization
- Genetic Algorithms in Search, Optimization, and Machine Learning
- Genetic algorithms, noise, and the sizing of populations, Complex Systems
- Multi-objective optimal synthesis and design of froth flotation circuits for mineral processing using the jumping gene adaptation of genetic algorithm, Industrial and Engineering Chemistry Research