
Applied Soft Computing

Volume 73, December 2018, Pages 67-82

A novel hybrid bat algorithm for solving continuous optimization problems

https://doi.org/10.1016/j.asoc.2018.08.012

Highlights

  • A novel hybrid bat algorithm (HBA) based on three modification methods is proposed.

  • The bat algorithm is hybridized with extremal optimization (EO) algorithm.

  • The contribution of three modifications is analyzed by classical benchmark functions.

  • The performance of HBA is validated by CEC 2014 test functions.

  • Statistical tests show that HBA significantly improves the performance of the bat algorithm.

Abstract

The Bat Algorithm (BA), a global optimization method, performs poorly on complex continuous optimization problems due to drawbacks such as premature convergence. In this paper, we propose a novel Hybrid Bat Algorithm (HBA) to improve the performance of BA. Three modification methods are incorporated into the standard BA to enhance its local search capability and its ability to escape from local optima. The effectiveness and contribution of these three modification methods are analyzed using classical benchmark functions. Moreover, the performance of HBA is evaluated on the numerical functions of the CEC 2014 test suite and compared with that of well-known optimization algorithms. The statistical test results indicate that HBA significantly improves upon the standard BA.

Introduction

There are many multimodal optimization problems in engineering and science. Therefore, an algorithm must be able to handle these problems in addition to unimodal optimization problems. Evolutionary Computing (EC), which is based on the biological concepts of populations and iterative improvement, is a powerful optimization approach, despite its slow convergence speed [1], [2], [3]. EC mainly comprises Evolutionary Algorithms (EAs) and Swarm Intelligence (SI). EAs are effective optimization techniques for finding global solutions to complex problems with several decision variables and constraints [4]. The Genetic Algorithm (GA) [5], Evolutionary Programming (EP) [6], Evolution Strategies (ESs) [7], [8], Genetic Programming (GP) [9], Differential Evolution (DE) [10], the Estimation of Distribution Algorithm (EDA) [11], and Immune System Optimization [12] are the most representative implementations of the EA concept, although their evolutionary patterns and principles differ. SI is inspired by animal behavior patterns and has attracted considerable attention from researchers; many biological metaphors have been used to propose new metaheuristic methods for solving optimization problems. Typical examples of SI include Particle Swarm Optimization (PSO) [13], the Fruit Fly Optimization Algorithm (FOA) [14], the Artificial Bee Colony Algorithm (ABC) [15], Cuckoo Search (CS) [16], Ant Colony Optimization (ACO) [17], Cat Swarm Optimization (CSO) [18], the Firefly Algorithm (FA) [19], the Artificial Fish Swarm Algorithm (AFSA) [20], and the Shuffled Frog Leaping Algorithm (SFLA) [21]. Most of these traditional optimization algorithms perform well only on a small number of small-scale instances; as problem complexity increases, their adaptability deteriorates [22].

Recently, inspired by the echolocation behavior of micro-bats with varying pulse rates of emission and loudness, the bat algorithm (BA) was proposed by Yang [23] as a novel swarm intelligence method. Due to the echolocation capability of micro-bats, these bats can not only find their prey and discriminate among types of insects even in complete darkness, but also precisely determine the distance, shape and location of prey.

The search mechanism of an algorithm has two crucial components: exploration, which finds promising solutions by searching new and unknown regions, and exploitation, which refines the solutions obtained through exploration. Several researchers have reported that the standard BA can effectively optimize low-dimensional functions and be applied to real-world optimization problems, and have shown that BA is more effective and robust than other standard optimization algorithms [23], [24]. Nevertheless, BA still has notable disadvantages; for example, premature convergence may occur on certain optimization problems. Thus, improving the performance of existing methods is important for the development of powerful optimization methods [22].

In this paper, a novel hybrid bat algorithm (HBA) is proposed to improve the performance of the standard BA on complex continuous optimization problems. Three modification methods are embedded into the standard BA. The key improvements of HBA are an enhanced local search capability and a greater ability to escape local optima. The effectiveness and contribution of these three modification methods are evaluated on classical benchmark functions [25], [26]. Moreover, the superiority of the proposed algorithm is demonstrated on the CEC 2014 test functions [27] in comparison with several standard algorithms, advanced optimization algorithms, and improved bat algorithms.

The rest of this paper is organized as follows. Section 2 gives a literature review of BA. The standard BA is summarized in Section 3. The hybrid bat algorithm is described in detail in Section 4. The comparative results of numerical experiments are presented and discussed in Section 5. Finally, Section 6 presents the conclusions of the work and provides suggestions for future work.

Section snippets

Literature review

Since the BA was proposed, it has become increasingly important for solving continuous and discrete optimization problems due to its simple structure, ease of implementation, and effectiveness. To solve more complex problems, numerous variants of BA have been developed.

Li and Zhou [28] presented a new bat algorithm (CBA) that is based on complex-valued encoding in which the real part and the imaginary part are updated separately. This algorithm can not only increase the diversity

Bat algorithm

The standard bat algorithm, which is a novel metaheuristic swarm intelligence optimization method for global numerical optimization, was proposed by Yang [23], who was inspired by the social behavior of micro-bats and their capability of using echolocation to sense distance. Three major characteristics of the echolocation process of the micro-bats are idealized. The idealized rules in BA are defined as follows [23]:

  • All bats use echolocation to sense distance, and they also ‘know’ the difference
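Concretely, these idealized rules lead to the well-known frequency, velocity, and position update equations of the standard BA, together with a local random walk governed by the loudness A_i and pulse emission rate r_i [23]. The following minimal Python sketch illustrates one iteration of that update loop; the parameter defaults (frequencies drawn from [0, 2], α = γ = 0.9, r0 = 0.5) are typical values from the BA literature, not settings taken from this paper.

```python
import numpy as np

def bat_step(pos, vel, loud, rate, best, fitness, bounds, t,
             f_min=0.0, f_max=2.0, alpha=0.9, gamma=0.9, r0=0.5):
    """One iteration of the standard BA update loop (illustrative sketch)."""
    lo, hi = bounds
    n, dim = pos.shape
    for i in range(n):
        # Frequency, velocity, and position updates toward the global best.
        f = f_min + (f_max - f_min) * np.random.rand()
        vel[i] += (pos[i] - best) * f
        cand = np.clip(pos[i] + vel[i], lo, hi)
        # Local random walk around the best solution, scaled by the mean loudness.
        if np.random.rand() > rate[i]:
            cand = np.clip(best + np.random.uniform(-1, 1, dim) * loud.mean(), lo, hi)
        # Accept improving moves with probability given by the loudness A_i,
        # then reduce loudness and raise the pulse emission rate.
        if fitness(cand) <= fitness(pos[i]) and np.random.rand() < loud[i]:
            pos[i] = cand
            loud[i] *= alpha
            rate[i] = r0 * (1.0 - np.exp(-gamma * t))
    # Refresh the global best from the updated population.
    idx = int(np.argmin([fitness(p) for p in pos]))
    if fitness(pos[idx]) < fitness(best):
        best = pos[idx].copy()
    return pos, vel, loud, rate, best
```

In a full run, positions, velocities, loudness values, and pulse rates are initialized randomly and this step is repeated for t = 1, ..., T iterations.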

A novel hybrid bat algorithm

The standard bat algorithm performs poorly on complex continuous optimization problems [45]. In this paper, we introduce three modification methods, called M1, M2, and M3, into BA to enhance its local search capability and its ability to escape local optima.
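The details of M1–M3 are developed in this section; as the Highlights and the Conclusion note, M3 hybridizes BA with the extremal optimization (EO) algorithm (cf. the PSO–EO hybrid of Chen et al.). For orientation only, the Python sketch below shows a generic EO-style mutation step for a continuous solution vector, in which the single component judged to contribute least is perturbed while the others are kept; the component-scoring rule and mutation scale are illustrative assumptions, not the formulation used in this paper.

```python
import numpy as np

def eo_style_step(x, fitness, bounds, sigma=0.1, rng=None):
    """Generic extremal-optimization-style mutation (illustrative assumption)."""
    rng = rng if rng is not None else np.random.default_rng()
    lo, hi = bounds
    base = fitness(x)
    # Score each component by how much perturbing it alone improves the fitness.
    gains = np.empty(len(x))
    for j in range(len(x)):
        trial = x.copy()
        trial[j] = np.clip(trial[j] + sigma * rng.standard_normal(), lo, hi)
        gains[j] = base - fitness(trial)        # positive = improvement
    worst = int(np.argmin(gains))               # the "weakest" component
    y = x.copy()
    y[worst] = np.clip(y[worst] + sigma * rng.standard_normal(), lo, hi)
    return y                                    # classic EO accepts the move unconditionally
```

Because EO always replaces its weakest component, pairing it with a population-based method such as BA is a natural way to strengthen local refinement around promising solutions.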

Experiments and discussions

To evaluate the performance of the proposed algorithm, we conducted two numerical experiments. Classical benchmark functions [25], [26] were used to analyze the contributions of the different modification methods. Additionally, CEC 2014 test functions [27] were utilized to validate the feasibility and effectiveness of the proposed algorithm on complex continuous optimization problems. All experiments were executed using MATLAB R2012a on the same Intel(R) Core(TM) i5-2400 CPU @ 3.10 GHz
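As context for this protocol, the sketch below illustrates how per-run results on a classical benchmark function can be compared with the Wilcoxon signed-rank test, the kind of nonparametric test recommended by Derrac et al. for comparing evolutionary and swarm intelligence algorithms. Rastrigin and the two random-search "algorithms" are placeholders chosen purely for illustration; the paper's actual function sets are those of [25], [26] and the CEC 2014 suite [27], and its comparison algorithms are those listed in the Introduction.

```python
import numpy as np
from scipy.stats import wilcoxon

def rastrigin(x):
    # Classical multimodal benchmark with global minimum 0 at the origin.
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def best_of_random_search(n_evals, dim=10, rng=None):
    # Placeholder optimizer that reports the best value found by random sampling.
    rng = rng if rng is not None else np.random.default_rng()
    samples = rng.uniform(-5.12, 5.12, size=(n_evals, dim))
    return min(rastrigin(s) for s in samples)

# Per-run best values for two placeholder algorithms over 30 independent runs.
runs_a = [best_of_random_search(2000) for _ in range(30)]
runs_b = [best_of_random_search(4000) for _ in range(30)]

# Nonparametric test of whether the two result distributions differ significantly.
stat, p = wilcoxon(runs_a, runs_b)
print(f"Wilcoxon signed-rank: statistic={stat:.1f}, p-value={p:.4f}")
```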

Conclusion

In this paper, a novel hybrid bat algorithm (HBA) is proposed for complex continuous optimization problems. Three modification methods are combined with the standard BA: modification of the initial population (M1), modification of location updating (M2), and hybridization with extremal optimization (M3). M1 focuses on enhancing the diversity of the initial bat population in the proposed algorithm. M2 is designed to improve the local search capability during later iterations through embedding a

Acknowledgment

The authors would like to acknowledge the National Science and Technology Major Project of China for supporting this study through the project “The seventh generation ultra-deepwater drilling platform (ship) innovation project” with grant number D719.

References (64)

  • Yilmaz, S., et al., A new modification approach on bat algorithm for solving optimization problems, Appl. Soft Comput. (2015)

  • Xiao, L., et al., Multi-step wind speed forecasting based on a hybrid forecasting architecture and an improved bat algorithm, Energy Convers. Manage. (2017)

  • Meng, X., et al., A novel bat algorithm with habitat selection and Doppler effect in echoes for optimization, Expert Syst. Appl. (2015)

  • Tavazoei, M.S., et al., An optimization algorithm based on chaotic behavior and fractal nature, J. Comput. Appl. Math. (2007)

  • Zhang, C.G., et al., Scale-free fully informed particle swarm optimization algorithm, Inform. Sci. (2011)

  • Chen, M.R., et al., A novel particle swarm optimizer hybridized with extremal optimization, Appl. Soft Comput. (2010)

  • Topal, A.O., et al., A novel meta-heuristic algorithm: Dynamic Virtual Bats Algorithm, Inform. Sci. (2016)

  • Derrac, J., et al., A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms, Swarm Evol. Comput. (2011)

  • Galletly, J., Evolutionary algorithms in theory and practice, Complexity (1996)

  • Holland, J.H., Adaptation in natural and artificial systems: an introductory analysis with applications to biology, control, and artificial intelligence, Q. Rev. Biol. (1992)

  • Eiben, A.E., et al., Evolutionary programming

  • Beyer, H.G., The Theory of Evolution Strategies (2001)

  • Beyer, H.G., et al., Covariance matrix adaptation revisited—the CMSA evolution strategy

  • Haeri, M.A., et al., Statistical genetic programming for symbolic regression, Appl. Soft Comput. (2017)

  • Das, S., et al., Differential evolution: a survey of the state-of-the-art, IEEE Trans. Evol. Comput. (2011)

  • Larrañaga, P., et al., Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation (2002)

  • Hofmeyr, S.A., et al., Architecture for an artificial immune system, Evol. Comput. (2000)

  • Kennedy, J., et al., Particle swarm optimization

  • Chu, S.C., et al., Cat Swarm Optimization, Trends in Artificial Intelligence (2006)

  • Yang, X.S., et al., Firefly algorithm: Recent advances and applications, Int. J. Swarm Intell. (2013)

  • Neshat, M., et al., Artificial fish swarm algorithm: a survey of the state-of-the-art, hybridization, combinatorial and indicative applications, Artif. Intell. Rev. (2014)

  • Muzaffar, E., et al., Shuffled frog-leaping algorithm: a memetic meta-heuristic for discrete optimization, Eng. Optim. (2006)