Manifold learning for efficient gravitational search algorithm
Introduction
In many fields, there is an increasing need for solutions to high-dimensional real-life optimization problems. Optimization is required in many domains, such as finding the right design parameters for a multi-objective power distribution feeder under several criteria [27], searching for high-energy particles [1], and dynamic locomotion [14]. One powerful branch of optimization algorithms that has grown considerably in the past two decades is known as metaheuristic algorithms. These algorithms model physical or biological processes, inspired by tasks such as hunting, defence, navigation, foraging and lowering energy levels, that inherently solve high-dimensional optimization problems.
Previous research [6] has established that metaheuristic algorithms have two main attributes. The first is stochastic behavior. Deterministic solvers converge to the same nearby local optimum from the same initial starting point. In contrast, metaheuristics incorporate randomness and are thus capable of avoiding local optima in the search for a global solution; in particular cases, however, they can still get trapped, which reduces their overall performance and applicability. The second fundamental property for effectively solving high-dimensional optimization problems is therefore the right balance between exploration and exploitation. Exploration enables the search for a global solution while acquiring more information; exploitation uses current knowledge and performs a local search around good solutions [8].
Early metaheuristic algorithms include the Genetic Algorithm (GA), inspired by Darwin's theory [15], [26], Simulated Annealing (SA), based on thermodynamic laws [19], and Particle Swarm Optimization (PSO), inspired by animal flocks such as birds and fish [5], [18]. More recent algorithms include Bacteria Foraging Optimization (BFO), which mimics the way bacteria search for nutrients [28], and the Ant Lion Optimizer (ALO), which mimics the hunting mechanism of antlions in nature [24]. Similarly, the Gravitational Search Algorithm (GSA) [30] and its variations [10], [13] are inspired by the laws of gravity and motion. Additional metaheuristic algorithms are reviewed in [11], [16], [17], [25], [34].
A more closely related metaheuristic algorithm is reviewed in [23]. The algorithm, termed the Gray Wolf Optimizer (GWO), employs feature selection to eliminate irrelevant and redundant data while searching for the optimal solution. A classification-accuracy-based fitness function is proposed to explore regions of the complex search space. The authors compare this algorithm with PSO and GA over a set of benchmark datasets from a machine learning repository. Dimensionality reduction is also used in [22] to boost the performance of a Gaussian process model.
The standard GSA [30] uses the Euclidean metric to calculate forces between agents based on Newton's law of gravity. However, in many problems the set of solutions to the optimization problem lies in a lower-dimensional subspace embedded in the ambient search space. Using a Euclidean metric may then produce improper forces, bias the results, and trap or delay agents in local optima. Therefore, inspired by GSA, in this work we present a modified algorithm termed the Curved Space Gravitational Search Algorithm (CSGSA). We propose working in the lower-dimensional subspace learned by an unsupervised machine learning algorithm. Through manifold learning we find geodesic distances across the solution subspace, which yield better-fitted forces between agents.
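To illustrate the gap between the two metrics, the following sketch approximates geodesic distances the standard way: build a k-nearest-neighbour graph with Euclidean edge weights and take shortest paths along it. This is an illustrative Isomap-style construction, not the paper's exact Diffusion-maps machinery; the function name and parameters are ours.

```python
import numpy as np

def geodesic_distances(points, k=3):
    """Approximate geodesic distances on the data manifold:
    k-NN graph with Euclidean edge weights, then Floyd-Warshall
    shortest paths (illustrative, O(n^3))."""
    n = len(points)
    # Pairwise Euclidean distances in the ambient space.
    diff = points[:, None, :] - points[None, :, :]
    euclid = np.sqrt((diff ** 2).sum(-1))
    # Keep only each point's k nearest neighbours as graph edges.
    graph = np.full((n, n), np.inf)
    np.fill_diagonal(graph, 0.0)
    for i in range(n):
        nearest = np.argsort(euclid[i])[1:k + 1]   # skip self at index 0
        graph[i, nearest] = euclid[i, nearest]
        graph[nearest, i] = euclid[i, nearest]     # symmetrise
    # Floyd-Warshall: shortest path along the graph ~ geodesic distance.
    for m in range(n):
        graph = np.minimum(graph, graph[:, m:m + 1] + graph[m:m + 1, :])
    return euclid, graph

# Points sampled along a quarter circle: the manifold is 1-D, so the
# geodesic (arc) distance exceeds the Euclidean chord between endpoints.
theta = np.linspace(0, np.pi / 2, 20)
pts = np.stack([np.cos(theta), np.sin(theta)], axis=1)
euclid, geo = geodesic_distances(pts, k=2)
# euclid[0, -1] is the chord (~1.414); geo[0, -1] tracks the arc (~pi/2).
```

On such a curved solution set, forces computed from `euclid` underestimate how far apart two agents really are along the manifold, which is exactly the bias the CSGSA aims to remove.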
Some semi-supervised algorithms make an assumption similar to that of our manifold learning approach, namely a smooth manifold structure in the data. For example, Chapelle et al. [7] use this concept to find middle ground between a fully labeled training set and no labels at all. Similarly, Belkin and Niyogi [3] assume that a high-dimensional dataset used for a classification problem actually resides on a lower-dimensional manifold, and suggest exploiting this structure to overcome the tedious task of class labeling. Although there is no direct link between our work on metaheuristic search algorithms and classification, the fair assumption that real-life high-dimensional problems lie on a lower-dimensional submanifold still holds. The same idea is reviewed in [31].
Along with the manifold learning, we add elitism to the algorithm by incorporating a simple memory-based approach. Consequently, this work provides an important opportunity to advance the understanding of combining two domains: metaheuristic algorithms and unsupervised learning. To support our contributions, we performed an extensive comparative study of our proposed algorithm against the state of the art: GSA, ALO and PSO. The comparison is carried out over a large set of standard benchmark functions, allowing effective analysis. The comparative analysis also provides insight into the performance of prominent metaheuristic algorithms on various functions and may assist in future choices of algorithms. We note that we do not compare the complexity of the algorithms, since complexity analysis for such metaheuristic optimization algorithms does not exist due to their stochastic nature.
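The elitism mentioned above can be sketched in a few lines: a best-so-far memory is kept alongside the stochastic search, so good solutions are never lost to a random move. This toy random-walk searcher is purely illustrative and is not the paper's exact memory scheme.

```python
import random

def noisy_search_with_elitism(f, x0, steps=200, sigma=0.5, seed=0):
    """Toy stochastic maximizer with an elite memory: the best solution
    ever evaluated is stored and returned, so the reported fitness can
    only improve over time even though moves are random."""
    rng = random.Random(seed)
    x = x0
    elite_x, elite_f = x0, f(x0)          # memory: best-so-far
    for _ in range(steps):
        x = x + rng.gauss(0, sigma)       # random move (may worsen f)
        fx = f(x)
        if fx > elite_f:                  # maximization: keep improvements
            elite_x, elite_f = x, fx
    return elite_x, elite_f

# Maximize a concave toy objective peaking at x = 3.
best_x, best_f = noisy_search_with_elitism(lambda x: -(x - 3) ** 2, x0=0.0)
```

The key guarantee is monotonicity of the reported result: `best_f` is never worse than the initial fitness, regardless of where the random walk wanders.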
The paper is organized as follows. Section 2 introduces the original GSA algorithm and addresses cases in which the algorithm can bias the results. In Section 3, the CSGSA and its characteristics are described. Section 4 provides a brief overview of two state-of-the-art approaches used in the comparative study of Section 5.
Section snippets
Gravitational search algorithm (GSA)
A continuous parameter maximization problem for a given objective function of the form $f:\mathbb{R}^n \rightarrow \mathbb{R}$ is defined as finding some global optimum $x^* \in \mathbb{R}^n$ such that $f(x^*) \geq f(x)$ is satisfied for any $x \in \mathbb{R}^n$. We note that, without loss of generality, the theoretic discussion refers to a maximization problem; a minimization problem can easily be addressed as well. Next, we present the original GSA algorithm as proposed in [30], followed by a discussion of cases in which the algorithm fails to
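Since the section body is truncated here, a minimal sketch of the standard GSA of [30] may help: agent masses are derived from normalized fitness, forces follow a gravity-like law over Euclidean distances with a decaying gravitational constant, and velocities and positions are updated stochastically. Parameter names (`g0`, `alpha`) follow common usage in the GSA literature but are illustrative; refinements such as the Kbest schedule are omitted.

```python
import numpy as np

def gsa_maximize(f, bounds, n_agents=30, iters=200, g0=100.0, alpha=20.0, seed=0):
    """Minimal Gravitational Search Algorithm sketch (maximization)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_agents, len(lo)))   # agent positions
    v = np.zeros_like(x)                           # agent velocities
    eps = 1e-12
    best_x, best_f = None, -np.inf
    for t in range(iters):
        fit = np.array([f(xi) for xi in x])
        if fit.max() > best_f:                     # track best-so-far
            best_f, best_x = fit.max(), x[fit.argmax()].copy()
        # Masses: normalized fitness, best agent heaviest.
        m = (fit - fit.min()) / (fit.max() - fit.min() + eps)
        M = m / (m.sum() + eps)
        # Gravitational constant decays over the iterations.
        G = g0 * np.exp(-alpha * t / iters)
        # Pairwise displacements and Euclidean distances between agents.
        diff = x[None, :, :] - x[:, None, :]       # diff[i, j] = x_j - x_i
        R = np.linalg.norm(diff, axis=-1)
        # Randomly weighted attraction toward every other agent;
        # inertial mass cancels, so this is directly the acceleration.
        w = rng.random((n_agents, n_agents))
        coef = G * w * M[None, :] / (R + eps)
        a = (coef[:, :, None] * diff).sum(axis=1)
        v = rng.random((n_agents, len(lo))) * v + a
        x = np.clip(x + v, lo, hi)
    return best_x, best_f

# Maximize f(x) = -||x||^2 (i.e., minimize the sphere function) in 2-D.
best_x, best_f = gsa_maximize(lambda z: -np.sum(z ** 2),
                              (np.array([-5.0, -5.0]), np.array([5.0, 5.0])))
```

Note that the distances `R` here are Euclidean in the ambient space; the CSGSA of the next section replaces them with geodesic distances learned from the agents' positions.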
Curved space gravitational search algorithm
In this section we introduce our modifications to the GSA that address the issues discussed in the previous section.
Metaheuristic state-of-the-art algorithms
In the comparative study of Section 5, we compare the CSGSA to Particle Swarm Optimization (PSO), the Ant Lion Optimizer (ALO) and the original GSA. PSO and ALO are briefly reviewed in this section prior to the comparative analysis.
Experimental results
In this section we evaluate the performance of the proposed algorithm and provide a comparative analysis. The CSGSA is compared to ALO, PSO and the original GSA. In our analysis, we focus on two important performance attributes: the final fitness value and the rate of convergence. To that end, the algorithms are benchmarked on the set of standard functions presented in Section 5.1, followed by a comparative study in Section 5.2.
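The two attributes above can be measured with a small harness that records, per iteration and over independent runs, the best-so-far fitness (the convergence curve) and its final value. This is a generic sketch of such a protocol, not the paper's exact experimental code; `random_search` stands in for any optimizer.

```python
import random
import statistics

def benchmark(optimizer, f, runs=10, iters=300):
    """Run `optimizer` several times on objective `f`; report the mean
    and std of the final fitness plus the mean convergence curve
    (best-so-far fitness per iteration, maximization)."""
    curves = []
    for r in range(runs):
        best, curve = float("-inf"), []
        for x in optimizer(f, iters, seed=r):   # one candidate per iteration
            best = max(best, f(x))
            curve.append(best)
        curves.append(curve)
    finals = [c[-1] for c in curves]
    mean_curve = [statistics.mean(c[t] for c in curves) for t in range(iters)]
    return statistics.mean(finals), statistics.stdev(finals), mean_curve

def random_search(f, iters, seed=0):
    """Baseline optimizer: uniform sampling in [-5, 5]^2."""
    rng = random.Random(seed)
    for _ in range(iters):
        yield [rng.uniform(-5, 5), rng.uniform(-5, 5)]

sphere = lambda x: -(x[0] ** 2 + x[1] ** 2)   # maximize => minimize ||x||^2
mean_final, std_final, curve = benchmark(random_search, sphere)
```

Because the curve tracks best-so-far fitness, it is nondecreasing by construction, and its slope gives a direct visual comparison of convergence rates between algorithms.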
Conclusions
In this work we modified the original GSA by improving information transfer through manifold learning and by incorporating elitism. Manifold learning using Diffusion maps enabled the computation of better-fitted forces between agents. The performance of the proposed algorithm was benchmarked and compared to other state-of-the-art algorithms over 47 test functions. The results show that the proposed algorithm can find better optima on many of the benchmark functions. Moreover, the manifold
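For reference, the Diffusion-maps construction named above can be sketched as follows (after Coifman and Lafon's formulation): a Gaussian kernel defines a random walk on the data, and the top non-trivial eigenvectors of the transition matrix, scaled by their eigenvalues, give coordinates in which Euclidean distance approximates diffusion distance on the manifold. The function name and parameters are illustrative, not the paper's implementation.

```python
import numpy as np

def diffusion_map(points, sigma=1.0, n_coords=2, t=1):
    """Minimal Diffusion maps sketch: Gaussian affinities, row-normalized
    into a Markov matrix, eigendecomposed; returns the first n_coords
    non-trivial diffusion coordinates at diffusion time t."""
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))          # Gaussian kernel
    P = K / K.sum(axis=1, keepdims=True)        # Markov transition matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)              # sort eigenvalues descending
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Skip the trivial constant eigenvector (eigenvalue 1).
    return vecs[:, 1:n_coords + 1] * (vals[1:n_coords + 1] ** t)

# Embed points sampled along a quarter circle: the leading diffusion
# coordinate should vary smoothly with position along the curve.
theta = np.linspace(0, np.pi / 2, 20)
pts = np.stack([np.cos(theta), np.sin(theta)], axis=1)
emb = diffusion_map(pts, sigma=0.3)
```

In the embedding, plain Euclidean distances between agents respect the curve's intrinsic geometry, which is what allows the modified force computation.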
Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
References (34)
- et al., A study of particle swarm optimization particle trajectories, Inf. Sci. (2006)
- et al., Diffusion maps, Appl. Comput. Harmon. Anal. (2006)
- et al., Water evaporation optimization: a novel physically inspired optimization algorithm, Comput. Struct. (2016)
- et al., Gray wolf optimizer for hyperspectral band selection, Appl. Soft Comput. (2016)
- The ant lion optimizer, Adv. Eng. Softw. (2015)
- et al., GSA: a gravitational search algorithm, Inf. Sci. (2009)
- et al., Searching for exotic particles in high-energy physics with deep learning, Nature Commun. (2014)
- et al., Laplacian eigenmaps for dimensionality reduction and data representation, Neural Comput. (2003)
- et al., Semi-supervised learning on Riemannian manifolds, Mach. Learn. (2004)
- The theory of dynamic programming, Technical Report (1954)
- Metaheuristics in combinatorial optimization: overview and conceptual comparison, ACM Comput. Surv. (CSUR)
- Semi-Supervised Learning
- Optimal contraction theorem for exploration-exploitation tradeoff in search and optimization, IEEE Trans. Syst. Man Cybernet. - Part A
- Black hole: a new operator for gravitational search algorithm, Int. J. Comput. Intell. Syst.
- Ant system: optimization by a colony of cooperating agents, IEEE Trans. Syst. Man Cybernet. Part B (Cybernetics)
- Image completion by diffusion maps and spectral relaxation, IEEE Trans. Image Process.
- Escape velocity: a new operator for gravitational search algorithm, Neural Comput. Appl.