Applied Soft Computing

Volume 95, October 2020, 106347

A competitive chain-based Harris Hawks Optimizer for global optimization and multi-level image thresholding problems

https://doi.org/10.1016/j.asoc.2020.106347

Highlights

  • This paper presents an enhanced Harris Hawks Optimizer (HHO).

  • Comprehensive experiments are performed on more than forty-five benchmark problems.

  • Extensive results show the more stable performance of the proposed variant of HHO.

Abstract

This paper presents an enhanced Harris Hawks Optimizer (HHO) to tackle global optimization and to determine the optimal threshold values for multi-level image segmentation problems. HHO is a recent swarm-based metaheuristic that simulates the cooperative behaviors of Harris hawks when catching rabbits. HHO has established a strong record as a swarm-based optimization technique. However, it may still face limitations when dealing with highly multi-modal and composition problems. For example, the optimizer may stagnate in local optima and converge prematurely while performing its exploration and exploitation phases. To mitigate these drawbacks, an improved HHO is proposed that employs the salp swarm algorithm (SSA) as a competing method to enhance the balance between its exploration and exploitation trends. First, a set of solutions is generated. These solutions are then divided into two halves: the exploratory and exploitative phases of HHO are applied to the first half, while the searching stages of SSA update the solutions in the second half. Thereafter, the best solutions from the union of the two sub-populations are selected to continue the iterative process. Based on the improved HHO, called HHOSSA, an effective multi-level image segmentation approach is also developed in this research. A comprehensive set of experiments is performed using 36 IEEE CEC 2005 benchmark functions and 11 natural gray-scale images. Extensive results and comparisons show the high ability of the SSA operators to improve the HHO's performance, since the proposed HHOSSA achieves more stable performance than HHO, SSA, and many other well-known methods.

Introduction

In recent years, stochastic optimizers have gained increasing momentum among researchers in tackling problems in various fields [1]. Among these methods, swarm-based optimizers have attracted particular attention. These algorithms search the feature space through a sequence of decentralized, self-organized, and cooperative behaviors. Search agents communicate and obtain information about the target landscape. The main evolutionary foundations of swarm-based techniques are inspired by the foraging of animals, the survival of the fittest, and hunting [2]. These generic methods are problem-independent (black-box), and they can provide approximate solutions to problems with a wide range of decision variables based on their stochastic searching phases [3]. Many years of observations reveal that they can achieve very competitive and near-optimal solutions in comparison with several deterministic methods [4], [5]. The advantage of these approaches is that they perform well-balanced exploration and exploitation phases with acceptable convergence rates [6]. Furthermore, they can attain optimal or sub-optimal results based on iterative stochastic operations. Eminent population-based evolutionary and swarm methods include the genetic algorithm (GA) [7], particle swarm optimizer (PSO) [8], and differential evolution (DE) [9]. The success of these well-regarded techniques on hard computational cases has encouraged the soft computing community to develop more efficient swarm-based optimizers. Some of the latest well-established methods are the whale optimization algorithm (WOA) [10], moth-flame optimizer (MFO) [11], salp swarm algorithm (SSA) [12], fruit fly optimization (FFO) [13], grasshopper optimization algorithm (GOA) [14], bacterial foraging optimization (BFO) [15], and grey wolf optimizer (GWO) [16].

The Harris hawks optimizer is one of the recent optimizers, developed by Heidari et al. [17] to deal with continuous problems. HHO is built on the interactions among hawks that cooperate to catch prey. The method consists of six phases of exploration and exploitation. Results and observations confirm that the conventional HHO has several performance advantages, such as fast convergence and a stable balance between the searching phases. In particular, it shows significant exploitation trends in the last stages. Owing to features such as its dynamic structure and active mathematical operators, HHO can deliver enhanced results compared with many well-established optimizers such as MFO, WOA, CS, PSO, DE, and GWO. HHO employs randomized processes to perform exploration and exploitation and to find optimal solutions. Despite the novelty of HHO, little work has investigated the efficacy of this optimizer on a wider range of problems. Moreover, although HHO is an efficient swarm-based method, it may suffer from stagnation in local optima and immature convergence when dealing with complex problems.
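For concreteness, the switching among these phases in the conventional HHO is driven by the prey's escaping energy E = 2E0(1 − t/T), which selects exploration when |E| ≥ 1 and one of four besiege styles of exploitation when |E| < 1. The minimal Python sketch below illustrates only this switching logic, following the formulation of Heidari et al. [17]; the full position-update equations of the six phases are omitted and all names are illustrative.

```python
import numpy as np

def escaping_energy(t, T, rng):
    """Escaping energy E of the prey at iteration t of T (Heidari et al. [17]).
    E0 is drawn uniformly from [-1, 1] and the energy decays linearly over the run."""
    E0 = 2.0 * rng.random() - 1.0          # initial energy in [-1, 1]
    return 2.0 * E0 * (1.0 - t / T)        # |E| >= 1 -> exploration, |E| < 1 -> exploitation

def hho_phase(t, T, rng):
    """Return which search phase a hawk would enter at iteration t.
    The exploitation branch is further split by |E| and the escape chance r;
    the concrete position-update equations are omitted in this sketch."""
    E = escaping_energy(t, T, rng)
    if abs(E) >= 1.0:
        return "exploration"               # perch randomly or relative to other hawks
    r = rng.random()                       # chance of the rabbit escaping successfully
    if r >= 0.5:
        return "soft besiege" if abs(E) >= 0.5 else "hard besiege"
    return ("soft besiege with progressive rapid dives"
            if abs(E) >= 0.5 else "hard besiege with progressive rapid dives")

rng = np.random.default_rng(42)
print([hho_phase(t, 100, rng) for t in (0, 50, 99)])
```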

The mathematical model of the swarm-based SSA was developed to efficiently deal with unconstrained and constrained optimization tasks. The method is inspired by the behaviors of salps and the chains they intelligently form in oceans. The algorithm attracted much attention right after its appearance because of its straightforwardness and efficacy. Until now, many papers have tried to enhance or utilize the original variant of SSA for tackling single- or multi-objective problems. Recently, Faris et al. [18] comprehensively reviewed the main works on SSA. We also review some of the main contributions here. Hussien et al. [19] proposed the use of SSA in predicting chemical compound activities. Ekinci and Hekimoglu [20] applied SSA to discovering the parameters of the power system stabilizer in multi-machine power systems. Mohapatra and Sahu [21] employed the original SSA to design the fractional-order proportional–integral–derivative controller. Asaithambi and Rajappa [22] developed an SSA-based computing method to efficiently obtain the sizing of a CMOS differential amplifier, with promising results. SSA has also shown enhanced results in dealing with polarization curves in PEM fuel cells, determining the active power of an isolated microgrid [23], parameter identification of photovoltaic cell models [24], and finding optimal solutions of a PID-Fuzzy controller [25]. However, SSA has some drawbacks, such as unstable performance and a slow convergence rate. To mitigate these shortcomings, a chaotic version was proposed by Sayed et al. [26]. Faris et al. [27] developed an enhanced binary-encoded SSA to tackle feature selection (FS) datasets. Owing to the acceptable performance of binary SSA, another version with asynchronous structure updating and a multi-leadership operator was proposed by Aljarah et al. [12]. Singh et al. [28] fused SSA with the sine-cosine algorithm (SCA) to enhance its balance between exploration and exploitation trends. Masdari et al. [29] combined the discrete vortex approach and SSA to optimize airfoil-based Savonius wind turbines, and the results were significantly enhanced compared to the original SSA. Yang et al. [30] extended SSA with independent chains to deal with MPPT for PV systems.
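For context, the salp chain in the original SSA lets a leader move around the food source (the best solution found so far) while each follower tracks its predecessor, with a coefficient c1 = 2exp(−(4l/L)^2) that gradually shifts the search from exploration to exploitation. The minimal sketch below illustrates these two update rules under that standard formulation; variable names are illustrative and boundary handling is simplified.

```python
import numpy as np

def ssa_step(salps, food, lb, ub, l, L, rng):
    """One update of a salp chain (standard SSA formulation; names are illustrative).
    salps: (n, dim) positions, food: best solution found so far (the food source),
    lb/ub: per-dimension bound arrays, l/L: current and total iteration counts."""
    n, dim = salps.shape
    c1 = 2.0 * np.exp(-(4.0 * l / L) ** 2)      # shrinks over time: exploration -> exploitation
    new = salps.copy()
    for j in range(dim):                        # the leading salp moves around the food source
        c2, c3 = rng.random(), rng.random()
        step = c1 * ((ub[j] - lb[j]) * c2 + lb[j])
        new[0, j] = food[j] + step if c3 >= 0.5 else food[j] - step
    for i in range(1, n):                       # each follower moves toward its predecessor
        new[i] = 0.5 * (new[i] + new[i - 1])
    return np.clip(new, lb, ub)                 # keep positions inside the search bounds
```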

The problems mentioned above (stagnation and immature convergence) are two critical issues that all metaheuristic methods face due to their randomized exploration and exploitation operators [31], [32]. In the first one, immature convergence, the quality of solutions degrades or stops improving with more iterations because of limited (or insufficient) exploration capacity in the initial stages of optimization. In this case, the method cannot jump out of local optima. As a result, the optimizer switches to exploitative trends before sufficient exploration, which in turn severely sacrifices the quality of solutions. In the second problem, the optimizer cannot find any solution better than a local optimum. Hence, if we monitor the variations of the fitness values, no further significant improvement or evolutionary trend can be spotted. The stagnation shortcoming can occur within the exploration stage, the exploitation phase, or both. In the latter case, where stagnation is observed in both phases, the algorithm suffers from limited exploration patterns, a non-smooth shift from exploration to exploitation, and excessive exploitation procedures. Both of these problems are highly interrelated. In general, excessive exploration or exploitation leads to unstable performance of the main algorithm. For this reason, maintaining a steady balance between the exploratory and exploitative propensities can guarantee the quality of the solutions obtained by most swarm-based optimizers [33], [34].

HHO is a new, powerful optimizer that can still be enhanced in terms of convergence speed and local optima avoidance. A progressive selection scheme, the dynamic time-varying behavior of the escaping energy parameter, and Lévy-flight-based searching phases are the main advantages of HHO compared to SSA. HHO has attracted researchers, who have widely applied this optimizer to problems in various disciplines. Some of these cases are prediction of slope stability [35], landslide susceptibility [36], satellite image segmentation [37], photovoltaic array reconfiguration for alleviating partial shading [38], [39], image de-noising for satellite datasets [40], multilevel thresholding of color images [41], harmonic overloading optimization [42], and assessment of the bearing capacity of footings on two-layer foundation soils [43].

Jia et al. [37] proposed a modified HHO with a mutation operator; the results on satellite images show its improved performance and efficacy. Bao et al. [41] presented a hybrid version based on differential evolution to tackle multilevel thresholding of color images. Moayedi et al. [43] applied this approach to optimize the structure of a multi-layer perceptron (MLP), and high accuracy rates were observed within the optimization.

In this paper, the operators of the salp swarm algorithm (SSA) are considered as competing rules to enhance the balance between the exploration and exploitation inclinations of HHO. First, a set of agents is generated. These solutions are then divided into two halves: the exploratory and exploitative phases of HHO are applied to the first half, while the searching stages of SSA update the solutions in the second half. Afterward, the best solutions from the united sub-populations are selected to evolve in the next iterations. Using the modified HHO, an enhanced multi-level image segmentation technique is developed. Based on a comprehensive set of evaluations covering more than 45 numerical and applied benchmark cases, we observe that the results are improved compared to the original HHO and several other well-established methods in the field.
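The following minimal Python sketch illustrates this competitive splitting scheme for a minimization problem; hho_update and ssa_update are hypothetical placeholders for the full operator sets of the two algorithms, and the greedy recombination of the union follows the verbal description above rather than the authors' exact implementation.

```python
import numpy as np

def hhossa_iteration(pop, fitness_fn, hho_update, ssa_update, rng):
    """One competitive HHOSSA-style iteration as described above (illustrative sketch).
    pop: (n, dim) candidate solutions; hho_update / ssa_update are placeholders that
    apply one full round of HHO and SSA operators to a sub-population."""
    n = pop.shape[0]
    half_a, half_b = pop[: n // 2], pop[n // 2:]
    best = pop[np.argmin([fitness_fn(x) for x in pop])]   # shared global best ("prey"/"food source")

    cand_a = hho_update(half_a, best, rng)                # HHO phases act on the first half
    cand_b = ssa_update(half_b, best, rng)                # SSA chain acts on the second half

    union = np.vstack([half_a, half_b, cand_a, cand_b])   # union of parents and updated halves
    scores = np.array([fitness_fn(x) for x in union])
    return union[np.argsort(scores)[:n]]                  # keep the n best solutions for the next iteration
```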

The main contributions of this paper are:

  • This paper proposes a new hybrid, single-objective, modified HHO.

  • The proposed method uses the cores of SSA to improve the performance of the HHO in a way that both of them are competing together to find the optimal solution.

  • The proposed HHOSSA is applied to tackle thirty-six global optimization problems. In addition, we compare its performance with several well-established metaheuristic methods.

  • The proposed HHOSSA is applied to develop a multi-level image segmentation method for finding the optimal threshold values for a benchmark set of eleven images (an illustrative sketch of a typical thresholding objective is given after this list).
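To make the thresholding task in the last item concrete, the sketch below evaluates a candidate set of thresholds with Otsu's between-class variance, one of the standard histogram-based criteria for multi-level segmentation. This objective and all names are illustrative assumptions; the specific criterion adopted by the authors may differ.

```python
import numpy as np

def between_class_variance(hist, thresholds):
    """Otsu-style multi-level objective (used here only as an example criterion).
    hist: normalized gray-level histogram (sums to 1);
    thresholds: candidate threshold values splitting the gray range into classes."""
    levels = np.arange(len(hist))
    mu_total = float(np.sum(levels * hist))                     # global mean intensity
    edges = [0, *sorted(int(t) for t in thresholds), len(hist)]
    sigma_b = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):                   # one term per segmented class
        w = hist[lo:hi].sum()                                   # class probability
        if w > 0:
            mu = float(np.sum(levels[lo:hi] * hist[lo:hi])) / w  # class mean intensity
            sigma_b += w * (mu - mu_total) ** 2
    return sigma_b

# Typical use with an 8-bit image and an optimizer that minimizes a fitness function:
# hist = np.bincount(image.ravel(), minlength=256) / image.size
# fitness = lambda thresholds: -between_class_variance(hist, thresholds)
```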

The rest of this paper is organized as follows: The background of HHO and SSA are explained in Section 2. The steps of the proposed HHO-based approach are illustrated in Section 3. The performance of the proposed method is evaluated by using a set of experiments, as given in Section 4. The conclusions and future works are given in Section 5.

Section snippets

Harris Hawks Optimizer (HHO)

This optimizer is inspired by the social life of one of the smartest birds in nature [17]. Harris hawks try to catch prey through team collaboration. The hawks first perch at some locations in the vicinity of the potential locations of the prey [44]. They often hunt rabbits in nature. The rabbits, in turn, show escaping behaviors to increase their chance of survival. In this process, the hawks perform several chasing styles to increase their success rate. The conventional HHO algorithm

The proposed HHO-based approach

The structure of the proposed approach, called HHOSSA, is given in Fig. 4; it improves the HHO by using the operators of SSA. In the proposed HHOSSA approach, SSA and HHO compete to find the best solution. In this way, the proposed HHOSSA benefits from the exploration and exploitation strengths of both methods.

The exploration and exploitation mechanisms of any optimizer, including HHO and SSA, have their limitations. In some multimodal cases, both basic approaches cannot

Experimental results and discussions

In this section, a set of experiments is performed to assess the efficacy and results of the proposed HHOSSA in terms of different metrics. In the first experimental study, the proposed method is utilized to deal with 36 problems from the IEEE CEC 2005 competition [54]. In addition, to investigate the efficacy of the developed method in a practical case, the second experimental series aims to find the optimal threshold levels for a set of benchmark images. For this purpose, we developed an image

Conclusion and future directions

In this paper, we proposed an enhanced HHO based on natural selection theory (NST) concepts, which assume that competition between identically treated sets of solutions leads to improvements in the results and convergence. The proposed method, called HHOSSA, uses the SSA to compete with HHO during the process of updating the solutions. The proposed HHOSSA is used to tackle global optimization tasks and to find the optimal threshold values for multilevel image segmentation problems. The proposed HHOSSA is

CRediT authorship contribution statement

Mohamed Abd Elaziz: Conceptualization, Methodology, Software, Validation, Formal analysis, Investigation, Writing - original draft, Writing - review & editing, Visualization, Project administration. Ali Asghar Heidari: Conceptualization, Methodology, Investigation, Software, Writing - original draft, Writing - review & editing, Visualization, Project administration. Hamido Fujita: Supervision, Writing - review & editing. Hossein Moayedi: Writing - review & editing.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgments

We acknowledge the efforts and constructive comments of the anonymous reviewers and the respected editor for handling this research.

References (74)

  • Chen H. et al., An enhanced bacterial foraging optimization and its application for training kernel extreme learning machine, Appl. Soft Comput. (2020)
  • Mirjalili S. et al., Grey wolf optimizer, Adv. Eng. Softw. (2014)
  • Heidari A.A. et al., Harris hawks optimization: Algorithm and applications, Future Gener. Comput. Syst. (2019)
  • El-Fergany A.A., Extracting optimal parameters of PEM fuel cells using salp swarm optimizer, Renew. Energy (2018)
  • Abbassi R. et al., An efficient salp swarm-inspired algorithm for parameters identification of photovoltaic cell models, Energy Convers. Manage. (2019)
  • Faris H. et al., An efficient binary salp swarm algorithm with crossover scheme for feature selection problems, Knowl.-Based Syst. (2018)
  • Masdari M. et al., Optimization of airfoil-based Savonius wind turbine using coupled discrete vortex method and salp swarm algorithm, J. Cleaner Prod. (2019)
  • Yang B. et al., Novel bio-inspired memetic salp swarm algorithm and application to MPPT for PV systems considering partial shading condition, J. Cleaner Prod. (2019)
  • Wang M. et al., Toward an optimal kernel extreme learning machine using a chaotic moth-flame optimization strategy with applications in medical diagnoses, Neurocomputing (2017)
  • Xu Y. et al., Enhanced moth-flame optimizer with mutation strategy for global optimization, Inform. Sci. (2019)
  • Wang M. et al., Chaotic multi-swarm whale optimizer boosted support vector machine for medical diagnosis, Appl. Soft Comput. (2020)
  • Zhao X. et al., Chaos enhanced grey wolf optimization wrapped ELM for diagnosis of paraquat-poisoned patients, Comput. Biol. Chem. (2019)
  • Yousri D. et al., Fractional chaotic ensemble particle swarm optimizer for identifying the single, double, and three diode photovoltaic models' parameters, Energy (2020)
  • Yousri D. et al., Optimal photovoltaic array reconfiguration for alleviating the partial shading influence based on a modified Harris hawks optimizer, Energy Convers. Manage. (2020)
  • Ridha H.M. et al., Boosted mutation-based Harris hawks optimizer for parameters identification of single-diode solar cell models, Energy Convers. Manage. (2020)
  • Chen H. et al., Parameters identification of photovoltaic cells and modules using diversification-enriched Harris hawks optimization with chaotic drifts, J. Cleaner Prod. (2020)
  • Gao W. et al., Study of biological networks using graph theory, Saudi J. Biol. Sci. (2018)
  • Gao W. et al., Nano properties analysis via fourth multiplicative ABC indicator calculating, Arab. J. Chem. (2018)
  • Gao W. et al., Partial multi-dividing ontology learning algorithm, Inform. Sci. (2018)
  • Abbassi A. et al., Parameters identification of photovoltaic cell models using enhanced exploratory salp chains-based approach, Energy (2020)
  • Mirjalili S. et al., Grey wolf optimizer, Adv. Eng. Softw. (2014)
  • Karaboga D. et al., A comparative study of artificial bee colony algorithm, Appl. Math. Comput. (2009)
  • Mirjalili S. et al., Multi-objective grey wolf optimizer: a novel algorithm for multi-criterion optimization, Expert Syst. Appl. (2016)
  • Elaziz M.A. et al., Task scheduling in cloud computing based on hybrid moth search algorithm and differential evolution, Knowl.-Based Syst. (2019)
  • Fausto F. et al., A global optimization algorithm inspired in the behavior of selfish herds, Biosystems (2017)
  • Zaitoun N.M. et al., Survey on image segmentation techniques, Procedia Comput. Sci. (2015)
  • Akay B., A study on particle swarm optimization and artificial bee colony algorithms for multilevel thresholding, Appl. Soft Comput. (2013)