Non-parametric particle swarm optimization for global optimization
Introduction
PSO [1] is a population-based algorithm inspired by the social behavior of bird flocking and fish schooling. In the algorithm, each member of the swarm, called a particle, represents a potential solution, i.e., a point in the search space, and the global optimum is regarded as the location of food. Each particle adjusts its flying direction according to the best experiences obtained by itself and by the swarm in the solution space. The algorithm has a simple concept and is easy to implement; hence, it has received much attention for solving real-world optimization problems [2], [3], [4], [5], [6], [7]. Nevertheless, PSO may easily become trapped in local optima and shows a slow convergence rate when solving complex, high-dimensional multimodal objective functions [8].
A number of PSO variants have been proposed in the literature to overcome these problems. They improve the performance of PSO in different ways, e.g., by using various topologies, tuning the parameters, or combining PSO with other search techniques.
A local (ring) topological structure PSO (LPSO) [9] and a Von Neumann topological structure PSO (VPSO) [10] were proposed by Kennedy and Mendes to avoid trapping in local optima. According to Kennedy [9], [11], PSO with a small neighborhood may perform better on complex problems, while PSO with a large neighborhood performs better on simple problems. Suganthan [12] applied a dynamically adjusted neighborhood in which the neighborhood of a particle gradually grows until it includes all particles. Dynamic multi-swarm PSO (DMS-PSO) [13] was suggested by Liang and Suganthan, in which the population is divided into small swarms that are frequently regrouped to exchange information. Hu and Eberhart [14] applied a dynamic neighborhood in which the m nearest particles in the performance space are chosen as the new neighborhood of each particle in each generation. Mendes et al. [15] presented the fully informed particle swarm (FIPS) algorithm, which uses the information of the entire neighborhood to guide the particles toward the best solution. Parsopoulos and Vrahatis combined the global and local versions to form the unified particle swarm optimizer (UPSO) [16]. Gao et al. [17] used PSO with a stochastic search technique and chaotic opposition-based population initialization to solve complex multimodal problems; their algorithm, CSPSO, searches for new solutions in the neighborhoods of the previous best positions to escape from local optima.
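As an illustration of the ring (local-best) topology used by LPSO, the sketch below computes, for each particle, the index of the best personal best within its ring neighborhood. This is a generic illustration of the topology, not code from any of the cited papers; the function name and the k-neighbor parameter are our own.

```python
def local_bests(pbest_fitness, k=1):
    """For each particle, return the index of the best personal-best
    fitness among its ring neighbors (itself plus k neighbors on each
    side, with wrap-around), assuming minimization."""
    n = len(pbest_fitness)
    result = []
    for i in range(n):
        neighborhood = [(i + offset) % n for offset in range(-k, k + 1)]
        best = min(neighborhood, key=lambda j: pbest_fitness[j])
        result.append(best)
    return result

# Five particles; particle 3 has the best fitness overall, but each
# particle only sees the best within its own ring neighborhood.
fitness = [5.0, 2.0, 9.0, 1.0, 7.0]
print(local_bests(fitness))  # -> [1, 1, 3, 3, 3]
```

Because information spreads only through overlapping neighborhoods, the swarm converges more slowly but explores more thoroughly than with a single global best.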
The fitness-distance-ratio-based PSO (FDR-PSO) was introduced by Peram et al. [18]; in this algorithm, each particle moves toward a nearby particle with a higher fitness value. Liang et al. [8] developed comprehensive learning particle swarm optimization (CLPSO), which avoids local optima by encouraging each particle to learn from different particles on different dimensions.
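The fitness-distance-ratio idea can be sketched as follows: for one dimension of one particle, pick the other particle whose personal best maximizes the ratio of fitness improvement to coordinate distance (assuming minimization). The function name and argument layout below are illustrative and not taken from [18].

```python
def fdr_choice(i, d, positions, pbest, pbest_fit, x_fit):
    """Pick the index j maximizing the fitness-distance ratio for
    particle i on dimension d (minimization):
        ratio = (f(x_i) - f(pbest_j)) / |pbest_jd - x_id|
    Particles at zero distance on dimension d are skipped."""
    best_j, best_ratio = None, float("-inf")
    for j, p in enumerate(pbest):
        if j == i:
            continue
        dist = abs(p[d] - positions[i][d])
        if dist == 0:
            continue
        ratio = (x_fit[i] - pbest_fit[j]) / dist
        if ratio > best_ratio:
            best_j, best_ratio = j, ratio
    return best_j
```

The ratio rewards neighbors that offer a large fitness gain at a small move, so a mediocre but very close neighbor can win over the global best.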
In other research, Angeline [19] was the first to introduce a selection operator into PSO, and other researchers brought the crossover [20] and mutation [21] operators of genetic algorithms (GAs) into PSO. An adaptive fuzzy particle swarm optimization (AFPSO) [22] was proposed to adjust the PSO parameters based on fuzzy inference.
Beheshti et al. proposed the median-oriented PSO (MPSO) [23], which uses information from the median particle. They also introduced the centripetal accelerated PSO (CAPSO) [24], based on Newton's laws of motion, to accelerate the learning procedure and the convergence rate on optimization problems. Other PSO variants based on different techniques have been developed recently [25], [26], [27], [28].
Although the aforementioned algorithms have obtained satisfactory results on many optimization problems, some disadvantages remain. For example, LPSO exhibits a slow convergence rate on unimodal functions [23], [24], and CLPSO is not a good choice for solving unimodal problems [8]. Moreover, the majority of the algorithms require several parameters to be tuned, and setting these parameters can be challenging. Finally, some of the algorithms outperform PSO but lack its structural simplicity.
To overcome these drawbacks, this study introduces a non-parametric particle swarm optimization (NP-PSO) algorithm. The proposed method performs a global and local search over the search space with a fast convergence speed using two quadratic interpolation operations. No algorithmic parameter needs to be tuned in NP-PSO; that is, all PSO parameters are removed in the proposed algorithm.
The remainder of this study is organized as follows. Section 2 provides a brief overview of PSO. Section 3 describes the proposed algorithm, NP-PSO, in more detail. In Section 4, NP-PSO is applied to several unimodal and multimodal benchmark functions, and its performance is compared with that of several PSO algorithms from the literature. Finally, Section 5 presents conclusions and directions for further research.
Section snippets
Particle swarm optimization (PSO)
PSO is a population-based meta-heuristic algorithm that applies the two approaches of global exploration and local exploitation to find the optimum solution. Exploration is the ability to expand the search space, whereas exploitation is the ability to find the optima around a good solution. The algorithm is initialized by creating a swarm, i.e., a population of N particles, with random positions. Every particle is represented as a vector X_i = (x_{i1}, x_{i2}, …, x_{iD}) in a D-dimensional search space, where …
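The canonical global-best PSO update described in this section can be sketched as follows (assuming minimization; the inertia weight w and acceleration coefficients c1, c2 shown here are typical values from the PSO literature, not values prescribed by this paper):

```python
import numpy as np

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One canonical global-best PSO update for the whole swarm.
    x, v, pbest: arrays of shape (N, D); gbest: array of shape (D,).
    Each particle is pulled toward its own best position (pbest) and
    the swarm's best position (gbest), scaled by random factors."""
    rng = rng or np.random.default_rng()
    r1 = rng.random(x.shape)
    r2 = rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v
```

A full optimizer would additionally clip positions to the search bounds and update pbest and gbest after evaluating the objective at the new positions.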
NP-PSO – The proposed method
NP-PSO aims to overcome the disadvantages of PSO by avoiding local optima, accelerating the convergence speed, and removing algorithmic parameters. According to [23], [24], PSO shows a better performance than LPSO on unimodal problems, while LPSO provides good results on multimodal problems. Hence, both local and global topologies are applied in NP-PSO. Also, the search of new areas is improved by creating new particles in different areas. In the algorithm, each particle uses the best position found …
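The snippet above does not show NP-PSO's two quadratic interpolation operations in full, but such operations are typically built on the standard three-point quadratic interpolation formula, which returns the vertex of the parabola passing through three sampled points. The sketch below illustrates that standard formula only; it is not the paper's exact operator.

```python
def quadratic_interpolation(a, b, c, fa, fb, fc):
    """Vertex of the parabola through (a, fa), (b, fb), (c, fc) for a
    single coordinate. Returns None when the three points are
    collinear (zero denominator), in which case no vertex exists."""
    num = (b**2 - c**2) * fa + (c**2 - a**2) * fb + (a**2 - b**2) * fc
    den = (b - c) * fa + (c - a) * fb + (a - b) * fc
    if den == 0:
        return None
    return 0.5 * num / den

# Sampling f(x) = (x - 2)^2 at x = 0, 1, 3 recovers the minimizer x = 2.
print(quadratic_interpolation(0.0, 1.0, 3.0, 4.0, 1.0, 1.0))  # -> 2.0
```

In interpolation-based PSO variants, such a vertex is usually computed from three promising positions (e.g., personal and global bests) and used as a candidate new particle.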
Experimental results and discussion
In this section, the proposed NP-PSO algorithm is compared with six well-known PSO algorithms from the literature. The performance of the algorithms is evaluated on various unimodal and multimodal functions in different dimensions. Nineteen (19) benchmark functions [35], [36], [37] are selected in this study.
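The paper's 19 benchmark functions are not listed in this snippet; as representative examples of the two classes, the sphere function (unimodal) and the Rastrigin function (multimodal) are commonly used in such studies and can be written as:

```python
import numpy as np

def sphere(x):
    """Unimodal benchmark: f(x) = sum(x_i^2), global minimum 0 at the origin."""
    return np.sum(np.asarray(x) ** 2)

def rastrigin(x):
    """Multimodal benchmark with many regularly spaced local minima:
    f(x) = 10*D + sum(x_i^2 - 10*cos(2*pi*x_i)), global minimum 0 at the origin."""
    x = np.asarray(x, dtype=float)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
```

Unimodal functions such as the sphere mainly test convergence speed, while the Rastrigin function's dense grid of local minima tests an algorithm's ability to escape local optima.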
Conclusions and future research
This study presents a non-parametric PSO algorithm (NP-PSO) to improve the search ability and convergence efficiency of PSO. The algorithm does not use any algorithmic parameter and is as simple and easy to implement as the original PSO. The method enhances global exploration and local exploitation using both local and global topologies together with two quadratic interpolation operations. The new strategies increase the particles' ability to fly into larger potential spaces. To evaluate …
Acknowledgements
The authors thank the Research Management Center (RMC), Universiti Teknologi Malaysia (UTM) for supporting this R&D, and the Soft Computing Research Group (SCRG), Universiti Teknologi Malaysia (UTM), Johor Bahru, Malaysia for the inspiration and moral support in conducting this research. They would also like to thank the post-doctoral program, Universiti Teknologi Malaysia (UTM), for the financial support and research activities. This work is supported by The Ministry of Higher Education …
References (39)
- et al., Handling boundary constraints for particle swarm optimization in high-dimensional search space, Inform. Sci. (2011)
- et al., An improvement in RBF learning algorithm based on PSO for real time applications, Neurocomputing (2013)
- et al., Example-based learning particle swarm optimization for continuous optimization, Inform. Sci. (2012)
- et al., Niching particle swarm optimization with local search for multi-modal optimization, Inform. Sci. (2012)
- et al., A memetic particle swarm optimization algorithm for multimodal optimization problems, Inform. Sci. (2012)
- et al., Particle swarm optimization with chaotic opposition-based population initialization and stochastic search technique, Commun. Nonlinear Sci. Numer. Simul. (2012)
- et al., Hybrid particle swarm optimization with mutation for optimizing industrial product lines: an application to a mixed solution space considering both discrete and continuous design variables, Ind. Market. Manage. (2013)
- et al., Adaptive fuzzy particle swarm optimization for global optimization of multimodal functions, Inform. Sci. (2011)
- et al., MPSO: median-oriented particle swarm optimization, Appl. Math. Comput. (2013)
- et al., CAPSO: centripetal accelerated particle swarm optimization, Inform. Sci. (2014)
- Two-layer particle swarm optimization for unconstrained optimization problems, Appl. Soft Comput.
- Integrated learning particle swarm optimizer for global optimization, Appl. Soft Comput.
- Re-evaluating genetic algorithm performance under coordinate rotation of benchmark functions, Biosystems
- Particle swarm optimization
- Enhancement of artificial neural network learning using centripetal accelerated particle swarm optimization for medical diseases diagnosis, Soft Comput.
- Comprehensive learning particle swarm optimizer for global optimization of multimodal functions, IEEE Trans. Evol. Comput.
- Population structure and particle swarm performance
- Neighborhood topologies in fully informed and best-of-neighborhood particle swarms, IEEE Trans. Syst. Man Cybernet. Part C
- Small worlds and mega-minds: effects of neighborhood topology on particle swarm performance