Elsevier

Information Sciences

Volume 517, May 2020, Pages 18-36
Manifold learning for efficient gravitational search algorithm

https://doi.org/10.1016/j.ins.2019.12.047

Highlights

  • A modified version of the Gravitational Search Algorithm (GSA).

  • We incorporate manifold learning to compute better-fitted magnitudes for the attraction forces.

  • We incorporate elitism.

  • We benchmark the algorithm on a large set of functions and compare its performance with state-of-the-art metaheuristic optimization algorithms.

  • The comparative study shows that the improved method has advantages in terms of the final cost value and the rate of convergence.

Abstract

Metaheuristic algorithms provide a practical tool for optimization in high-dimensional search spaces. Some mimic natural phenomena such as swarms and flocks. A prominent example is the Gravitational Search Algorithm (GSA), inspired by Newton’s law of gravity, which manipulates agents modeled as point masses in the search space. The law of gravity states that the interaction force between two objects is inversely proportional to the squared Euclidean distance between them. In this paper we claim that when the set of solutions lies on a lower-dimensional manifold, the Euclidean distance yields unfitted forces and biases the results, causing suboptimal and slower convergence. We propose to modify the algorithm and use geodesic distances obtained through manifold learning via diffusion maps. In addition, we incorporate elitism by storing exploration data. We show the high performance of this approach in terms of the final solution value and the rate of convergence compared to other metaheuristic algorithms, including the original GSA. We also provide a comparative analysis of state-of-the-art optimization algorithms on a large set of standard benchmark functions.

Introduction

In many fields, there is an increasing need for solutions to high-dimensional real-life optimization problems. Optimization is required in many domains, such as finding the right design parameters for a multi-objective power distribution feeder under several criteria [27], searching for high-energy particles [1], and dynamic locomotion [14]. One powerful branch of optimization algorithms that has grown considerably in the past two decades is known as metaheuristic algorithms. These algorithms model physical or biological processes inspired by tasks such as hunting, defence, navigation, foraging and lowering energy levels, which inherently solve high-dimensional optimization problems.

Previous research [6] has established that metaheuristic algorithms have two main attributes. The first is stochastic behavior. Deterministic solvers converge to the same nearby local optimum from the same initial starting point. In contrast, metaheuristics incorporate randomness and are thus capable of escaping local optima in the search for a global solution. However, in some cases they can still get trapped, which reduces their overall performance and applicability. The second fundamental property for effectively solving high-dimensional optimization problems is the right balance between exploration and exploitation. Exploration enables the search for a global solution while acquiring more information; exploitation uses current knowledge to perform a local search around good solutions [8].

Early metaheuristic algorithms include the Genetic Algorithm (GA) inspired by Darwin’s theory [15], [26], Simulated Annealing (SA) based on thermodynamic laws [19], and Particle Swarm Optimization (PSO) inspired by animal flocks such as birds and fish [5], [18]. More recent algorithms include Bacterial Foraging Optimization (BFO), which mimics the way bacteria search for nutrients [28], and the Ant Lion Optimizer (ALO), which mimics the hunting mechanism of antlions in nature [24]. Similarly, the Gravitational Search Algorithm (GSA) [30] and its variants [10], [13] are inspired by the laws of gravity and motion. Additional metaheuristic algorithms are reviewed in [11], [16], [17], [25], [34].

A closely related metaheuristic algorithm is reviewed in [23]. The algorithm, termed the Grey Wolf Optimizer (GWO), employs feature selection to eliminate irrelevant and redundant data while searching for the optimal solution. A classification-accuracy-based fitness function was proposed to explore regions of the complex search space. The authors compare this algorithm with PSO and GA over a set of benchmark datasets from a machine-learning data repository. Dimensionality reduction is also used in [22] to boost the performance of a Gaussian process model.

The standard GSA [30] computes forces between agents in Euclidean space based on Newton’s law of gravity. However, in many problems, the set of solutions lies in a lower-dimensional subspace embedded in the ambient search space. Using a Euclidean metric may then produce improper forces, bias the results, and trap or delay agents in local optima. Therefore, inspired by GSA, in this work we present a modified algorithm termed the Curved Space Gravitational Search Algorithm (CSGSA). We propose working in the lower-dimensional subspace learned by an unsupervised machine-learning algorithm. Through manifold learning we find geodesic distances across the solution subspace, which yield better-fitted forces between agents.

Some semi-supervised algorithms make a similar assumption to our manifold-learning approach, namely that the data lie on a smooth manifold. For example, Chapelle et al. [7] use this concept to find middle ground between a fully labeled training set and no labels at all. Similarly, Belkin and Niyogi [3] assume that a high-dimensional dataset used for a classification problem actually resides on a lower-dimensional manifold, and suggest exploiting this structure to overcome the tedious task of class labeling. Although our work on metaheuristic search algorithms is not linked to classification, the underlying assumption that real-life high-dimensional problems lie on a lower-dimensional submanifold holds equally well here. The same idea is reviewed in [31].

Along with the manifold learning, we add elitism to the algorithm by incorporating a simple memory-based approach. Consequently, this work provides an important opportunity to advance the understanding of combining two domains: metaheuristic algorithms and unsupervised learning. To support our contributions, we performed an extensive comparative study of our proposed algorithm against the state of the art: GSA, ALO and PSO. The comparison is carried out on a large set of standard benchmark functions, allowing an effective analysis. It also provides insight into the performance of prominent metaheuristic algorithms on various functions and may assist in future choices of algorithms. We note that we do not compare computational complexity, since complexity analysis for such stochastic metaheuristic optimization algorithms does not exist.
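The memory-based elitism mentioned above is not detailed in this excerpt; the following is a minimal sketch of one plausible realization (the class name, archive size, and reinjection rule are our assumptions for illustration, not the authors' implementation):

```python
import numpy as np

class EliteMemory:
    """Hypothetical sketch of memory-based elitism: store the best
    positions seen during exploration and reinject them over the
    worst current agents (maximization convention)."""

    def __init__(self, size=5):
        self.size = size
        self.store = []  # list of (fitness, position) pairs

    def update(self, X, f):
        """Record this generation's agents, keep only the `size` best."""
        for xi, fi in zip(X, f):
            self.store.append((fi, xi.copy()))
        self.store.sort(key=lambda p: p[0], reverse=True)
        self.store = self.store[: self.size]

    def reinject(self, X, f):
        """Overwrite the worst current agents with better stored elites."""
        order = np.argsort(f)  # indices of worst agents first
        for slot, (fe, xe) in zip(order, self.store):
            if fe > f[slot]:
                X[slot] = xe
        return X

# usage: elites from one generation rescue a worse later generation
mem = EliteMemory(size=2)
mem.update(np.array([[0.0], [1.0], [2.0]]), np.array([0.0, 1.0, 2.0]))
X2 = mem.reinject(np.array([[5.0], [6.0], [7.0]]),
                  np.array([-3.0, -2.0, -1.0]))
```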

The paper is organized as follows. Section 2 introduces the original GSA and addresses cases in which the algorithm can bias the results. In Section 3, the CSGSA and its characteristics are described. Section 4 provides a brief overview of two state-of-the-art approaches used in the comparative study of Section 5.

Section snippets

Gravitational search algorithm (GSA)

A continuous parameter maximization problem for a given objective function f: X → ℝ with X ⊆ ℝⁿ is defined as finding a global optimum x* ∈ X such that f(x*) ≥ f(x) for all x ∈ X. We note that, without loss of generality, the theoretical discussion refers to a maximization problem; a minimization problem can easily be addressed as well. Next, we present the original GSA algorithm as proposed in [30], followed by a discussion of cases in which the algorithm fails to
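The Euclidean force computation that this paper argues against can be sketched as follows. This is a hedged reconstruction of one GSA iteration from the description in the abstract and [30]; the function name and the fixed gravitational constant `G` are our simplifications (GSA decays G over iterations):

```python
import numpy as np

def gsa_accelerations(X, fitness, G=1.0, eps=1e-12):
    """One GSA force step (sketch). X: (N, n) agent positions.
    Masses come from normalized fitness; attraction uses the
    Euclidean distance R_ij, the metric CSGSA replaces."""
    f = fitness(X)                          # fitness per agent, shape (N,)
    worst, best = f.min(), f.max()
    m = (f - worst) / (best - worst + eps)  # maximization normalization
    M = m / (m.sum() + eps)                 # masses sum to 1

    N, _ = X.shape
    acc = np.zeros_like(X)
    for i in range(N):
        for j in range(N):
            if i == j:
                continue
            diff = X[j] - X[i]
            R = np.linalg.norm(diff)        # Euclidean distance R_ij
            # stochastic weight keeps the search exploratory
            acc[i] += np.random.rand() * G * M[j] * diff / (R + eps)
    return acc

# usage: accelerations for a sphere-like maximization f(x) = -||x||^2
rng = np.random.default_rng(0)
X = rng.uniform(-5, 5, size=(20, 3))
a = gsa_accelerations(X, lambda P: -np.sum(P**2, axis=1))
```

Replacing `R` with a geodesic distance learned from the agents' positions is the core change the CSGSA section below describes.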

Curved space gravitational search algorithm

In this section we introduce our modifications to the GSA to address the issues discussed in the previous section.
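The geodesic distances the abstract attributes to diffusion maps can be sketched as below. This is a generic diffusion-maps construction, not the authors' exact pipeline; the kernel bandwidth `sigma`, diffusion time `t`, and embedding dimension `k` are our illustrative parameters:

```python
import numpy as np

def diffusion_distances(X, sigma=1.0, t=1, k=5):
    """Pairwise diffusion distances between agents in X (shape (N, n)),
    a manifold-aware proxy for geodesic distance along the sampled
    solution subspace."""
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-D2 / (2 * sigma**2))          # Gaussian affinity kernel
    d = W.sum(axis=1)                         # node degrees
    # symmetric conjugate of the Markov matrix, for a stable eigensolve
    S = W / np.sqrt(np.outer(d, d))
    vals, vecs = np.linalg.eigh(S)
    idx = np.argsort(vals)[::-1][1 : k + 1]   # skip the trivial top pair
    lam = vals[idx] ** t                      # diffusion at time t
    psi = vecs[:, idx] / np.sqrt(d)[:, None]  # right eigenvectors of P
    emb = psi * lam                           # diffusion coordinates
    # Euclidean distance in diffusion space = diffusion distance
    return np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)

# usage: distances among 15 agents in a 4-dimensional search space
rng = np.random.default_rng(1)
D = diffusion_distances(rng.normal(size=(15, 4)))
```

Feeding such distances into the force computation in place of the Euclidean `R` is the kind of substitution the CSGSA performs.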

Metaheuristic state-of-the-art algorithms

In the comparative study of Section 5, we compare the CSGSA to the Particle Swarm Optimization (PSO), to the Ant Lion Optimization (ALO) and to the original GSA. PSO and ALO are briefly reviewed in this section prior to the comparative analysis.

Experimental results

In this section we evaluate the performance of the proposed algorithm and provide a comparative analysis. The CSGSA is compared to ALO, PSO and the original GSA. In our analysis, we pay attention to two important attributes of performance: the final fitness value and the rate of convergence. To this end, the algorithms are benchmarked on the set of standard functions presented in Section 5.1, followed by a comparative study in Section 5.2.

Conclusions

In this work we modified the original GSA by improving the information transfer through manifold learning and incorporated elitism. The manifold learning using Diffusion maps enabled the acquisition of more fitted forces between agents. The performance of the proposed algorithm was benchmarked and compared to other state-of-the-art algorithms over 47 test functions. The results show that the proposed algorithm can find better optima in many of the benchmark functions. Moreover, the manifold

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References (34)

  • C. Blum et al., Metaheuristics in combinatorial optimization: overview and conceptual comparison, ACM Comput. Surv. (CSUR), 2003.

  • O. Chapelle et al., Semi-Supervised Learning, 2010.

  • J. Chen et al., Optimal contraction theorem for exploration-exploitation tradeoff in search and optimization, IEEE Trans. Syst. Man Cybernet. - Part A, 2009.

  • M. Doraghinejad et al., Black hole: a new operator for gravitational search algorithm, Int. J. Comput. Intell. Syst., 2014.

  • M. Dorigo et al., Ant system: optimization by a colony of cooperating agents, IEEE Trans. Syst. Man Cybernet. Part B (Cybernetics), 1996.

  • S. Gepshtein et al., Image completion by diffusion maps and spectral relaxation, IEEE Trans. Image Process., 2013.

  • U. Güvenç et al., Escape velocity: a new operator for gravitational search algorithm, Neural Comput. Appl., 2017.