Cluster structure prediction via revised particle-swarm optimization algorithm

https://doi.org/10.1016/j.cpc.2019.106945

Abstract

The global minimization of cluster structures at the atomic level is emerging as a state-of-the-art approach for accelerating the functionality-driven discovery of cluster-based materials. In this work, we have developed a method for the global optimization of Lennard-Jones (LJ), elemental metal and bimetallic clusters through a revised particle swarm optimization (RPSO) algorithm that incorporates a random learning procedure, a competition operation and a confusion mechanism. The approach requires only the size and chemical composition of a given cluster to predict stable or metastable structures at given external conditions. The random learning procedure significantly improves the performance of RPSO, in particular its convergence rate and optimization efficiency. The competition operation preserves the superiority of outstanding individuals while still accepting worse solutions with a certain probability. The confusion mechanism allows the particles to explore a wider space and makes large-scale optimization possible. Results on benchmark test functions show that RPSO converges much faster and more reliably than several other algorithms. In addition, RPSO performs well in the optimization of Lennard-Jones, Pt and Pt–Pd clusters with various sizes and compositions. The high success rate of RPSO demonstrates the reliability of this methodology and provides crucial insights for understanding the rich and complex structures of clusters.

Introduction

Clusters are an important class of functional materials with applications in fields such as catalysis, optics, magnetism, electronics and biology [1], [2], [3]. A central challenge in this field, however, is determining the global-minimum structure of a cluster, particularly when establishing the correspondence between material performance and underlying structure, since the properties of a cluster are intimately tied to its structure [4], [5]. Experimentally, structural determination by X-ray diffraction is highly developed and can be used to study cluster structures; nevertheless, experiments frequently fail to resolve structures because of low-quality diffraction data, particularly under extreme conditions. Therefore, the theoretical prediction of cluster structures from only the size and chemical composition is greatly needed [6]. Global minimization becomes increasingly difficult for larger clusters, e.g. n > 150, because the computational cost grows exponentially with cluster size. For example, the number of local minima is 10^3 for the LJ13 cluster, whereas the LJ55 cluster has at least 10^9 times more local minima than LJ13 [7]. To address this problem, researchers have developed many optimization algorithms, such as the genetic algorithm [8], [9], [10], [11], [12], the particle swarm algorithm [6], [13], [14], [15], the immune optimization algorithm [16], [17], the basin-hopping algorithm [18], [19], [20], etc. For example, the random sampling method is "simple" in principle but nontrivial in practice, and it works well in many applications [21], [22]. The genetic algorithm (GA) uses a self-improving strategy and has been successful in predicting many high-pressure structures [23], [24]. Nevertheless, it is still necessary to develop new global optimization methods for cluster structure prediction, especially for understanding the rich and complex structures of metal clusters.

Among these methods, the particle swarm algorithm originates from the theory of complex adaptive systems (CAS), which makes it very efficient for global optimization. The members of a CAS, referred to as agents, are adaptive and communicate with the environment and with the other individuals in the system, changing their structure and behavior in the course of this communication. The particle swarm optimization (PSO) algorithm [25] was first proposed in 1995 by Eberhart and Kennedy, and its basic concepts come from the study of bird foraging behavior. Unfortunately, basic PSO is easily trapped in local minima. To overcome this defect, an improved discrete PSO algorithm [15] was introduced in a recent study, but it is purely a discrete algorithm and can only be used to obtain the positions of atoms in a given stable configuration. Therefore, it is still necessary to improve the PSO algorithm for the global minimization of clusters with complex structures.
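For reference, a commonly used form of the basic PSO update (the inertia-weight variant; the notation below follows common convention and is not taken from this paper) moves each particle toward its own best-known position and the best position found by the swarm:

```latex
% Basic PSO update for particle i at iteration t (inertia-weight form).
% \omega: inertia weight; c_1, c_2: acceleration coefficients;
% r_1, r_2: uniform random numbers in [0, 1];
% p_i: personal best position of particle i; g: global best position.
\begin{aligned}
v_i^{t+1} &= \omega\, v_i^{t} + c_1 r_1 \bigl(p_i - x_i^{t}\bigr) + c_2 r_2 \bigl(g - x_i^{t}\bigr),\\
x_i^{t+1} &= x_i^{t} + v_i^{t+1}.
\end{aligned}
```

Because every particle is pulled toward the same global best g, the swarm can collapse onto a single basin of attraction, which is the trapping behavior that the revised algorithm described below aims to avoid.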

In this work, we develop a revised particle swarm optimization (RPSO) algorithm with a random learning procedure, a competition operation and a confusion mechanism for the global optimization of Lennard-Jones (LJ), elemental metal and bimetallic clusters. The convergence of RPSO on test functions is compared with that of several other algorithms. In addition, the efficiency of the RPSO algorithm for the geometry optimization of Pt and Pt–Pd clusters with various sizes and compositions is discussed. This paper is arranged as follows. In Section 2, the method and implementation of the RPSO algorithm are discussed in detail. A short overview of the results obtained with our method is presented in Section 3, and a summary is given in Section 4.

Section snippets

Method details

RPSO is a heuristic, iterative algorithm. Each step of RPSO contains four independent parts, namely evolution, local relaxation, competition and confusion. The main concept of RPSO is to construct a sequence of points X_0, X_1, X_2, ..., X_n, starting from some initial points X_0, such that the sequence converges to the global minimum of a given multi-variable function F(X). The procedure for constructing X_0, X_1, X_2, ..., X_n is introduced in the following part.
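To make the four stages concrete, the following is a minimal Python sketch of such an iteration applied to a Lennard-Jones cluster. It is an illustration under stated assumptions, not the authors' implementation: the reduced-unit LJ energy, the L-BFGS local relaxation, the inertia and acceleration constants, and the periodic re-scattering step are placeholders standing in for the evolution, local relaxation, competition and confusion stages described above, and the random learning procedure and probabilistic acceptance of worse solutions are only indicated in comments.

```python
import numpy as np
from scipy.optimize import minimize

def lj_energy(flat_coords):
    """Lennard-Jones energy in reduced units (epsilon = sigma = 1)."""
    x = flat_coords.reshape(-1, 3)
    e = 0.0
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            r = np.linalg.norm(x[i] - x[j])
            e += 4.0 * (r**-12 - r**-6)
    return e

def rpso_sketch(n_atoms, n_particles=16, n_steps=50, box=2.0, seed=0):
    rng = np.random.default_rng(seed)
    dim = 3 * n_atoms
    # Each particle of the swarm is one candidate cluster geometry.
    swarm = rng.uniform(-box, box, size=(n_particles, dim))
    velocities = np.zeros_like(swarm)
    pbest = swarm.copy()
    pbest_e = np.array([lj_energy(p) for p in swarm])
    gbest = pbest[np.argmin(pbest_e)].copy()

    for step in range(n_steps):
        for i in range(n_particles):
            # (1) Evolution: PSO-like move. A random learning procedure would
            # replace pbest[i] by the best position of a randomly chosen peer.
            r1, r2 = rng.random(dim), rng.random(dim)
            velocities[i] = (0.7 * velocities[i]
                             + 1.5 * r1 * (pbest[i] - swarm[i])
                             + 1.5 * r2 * (gbest - swarm[i]))
            trial = swarm[i] + velocities[i]
            # (2) Local relaxation: drive the trial geometry to a nearby minimum.
            res = minimize(lj_energy, trial, method="L-BFGS-B")
            trial, trial_e = res.x, res.fun
            # (3) Competition: keep the better solution; a full implementation
            # would also accept a worse one with a certain probability.
            swarm[i] = trial
            if trial_e < pbest_e[i]:
                pbest[i], pbest_e[i] = trial.copy(), trial_e
        # (4) Confusion: occasionally scatter one particle to widen the search.
        if step % 10 == 9:
            k = rng.integers(n_particles)
            swarm[k] = rng.uniform(-box, box, size=dim)
            velocities[k] = 0.0
        gbest = pbest[np.argmin(pbest_e)].copy()
    return gbest, pbest_e.min()

if __name__ == "__main__":
    coords, energy = rpso_sketch(n_atoms=13)
    print("Best LJ13 energy found (reduced units):", energy)
```

For metal clusters the LJ energy would be replaced by a many-body potential such as the Gupta potential, and the evolution, competition and confusion steps would follow the detailed rules given in Section 2.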

Benchmark functions test

The optimization ability of RPSO first needs to be verified. Fifteen benchmark functions [27] are used to evaluate the performance of RPSO; their details are listed in Table S1. For comparison, we also implement the genetic algorithm, basic PSO and RPSOLF [28] to optimize these functions. All test functions are minimization problems defined as follows [27]:

min f(x),  x = (x_1, x_2, ..., x_d)^T,

where d is the dimension of the function. Each algorithm runs 50 independent times, and each time …
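As an illustration of this protocol (d-dimensional minimization problems, each algorithm run many times independently), a minimal test harness is sketched below. The Rastrigin function and the random-search baseline are stand-ins chosen for brevity; they are not the benchmark set of Table S1 nor any of the algorithms compared in the paper.

```python
import numpy as np

def rastrigin(x):
    """Standard multimodal benchmark; a stand-in for the Table S1 functions."""
    x = np.asarray(x)
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

def random_search(f, d, rng, n_evals=10000, bound=5.12):
    """Trivial baseline: best of n_evals uniform random samples in [-bound, bound]^d."""
    samples = rng.uniform(-bound, bound, size=(n_evals, d))
    return min(f(s) for s in samples)

def evaluate(optimizer, f, d=30, n_runs=50, seed=0):
    """Run an optimizer n_runs independent times on a d-dimensional problem
    and report the mean and best final objective values."""
    results = np.array([optimizer(f, d, np.random.default_rng(seed + run))
                        for run in range(n_runs)])
    return results.mean(), results.min()

if __name__ == "__main__":
    mean_val, best_val = evaluate(random_search, rastrigin)
    print(f"random search on Rastrigin (d=30, 50 runs): "
          f"mean = {mean_val:.2f}, best = {best_val:.2f}")
```

Comparing algorithms then amounts to plugging each of them in place of random_search and comparing the reported statistics across the independent runs.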

Conclusion

In summary, we have developed a systematic methodology for the global optimization of clusters based on a revised particle swarm optimization algorithm with a random learning procedure, a competition operation and a confusion mechanism. The approach requires only the size and chemical composition of a given cluster to predict stable or metastable structures at given external conditions. In this method, the random learning procedure significantly improves the performance of RPSO, such as the convergence rate …

Acknowledgments

This work is supported by the National Natural Science Foundation of China (21822801, 21576008, 91634116) and the Fundamental Research Funds for the Central Universities, China (XK1802-1 and XK180301).

References (32)

  • Wang, Y. et al., Comput. Phys. Comm. (2012)
  • Shao, G.F. et al., Comput. Phys. Comm. (2015)
  • Cheng, L. et al., Chem. Phys. Lett. (2004)
  • Yan, B. et al., Comput. Phys. Comm. (2017)
  • Martin, T.P., Phys. Rep. (1996)
  • Hutchings, G.J., Nature Chem. (2010)
  • Zijlstra, P. et al., Nature (2009)
  • Gittins, D.I. et al., Nature (2000)
  • Zhang, L. et al., Nature Rev. Mater. (2017)
  • Ferrando, R. et al., Chem. Rev. (2008)
  • Wang, Y. et al., Phys. Rev. B (2010)
  • Baletto, F. et al., Rev. Modern Phys. (2005)
  • Johnston, R.L., Dalton Trans. (2003)
  • Darby, S. et al., J. Chem. Phys. (2002)
  • Rapallo, A. et al., J. Chem. Phys. (2005)
  • Rossi, G. et al., J. Chem. Phys. (2005)