Global optimization with one-class classification-assisted selection

https://doi.org/10.1016/j.swevo.2020.100801

Abstract

Selection in evolutionary algorithms (EAs) picks promising solutions from a set of candidates. Most selection strategies are fitness-driven: each solution is selected according to its fitness value. This fitness-based strategy wastes fitness evaluations, since some unpromising solutions are evaluated and then discarded without contributing useful search information. Our intention is to reduce the number of fitness evaluations spent during selection. We treat selection as a one-class classification procedure, so that offspring solutions similar to the current population, which contains the best solutions found so far, are more likely to be selected. We use the classifier to predict the categories of newly generated solutions, and only the solutions predicted as ‘promising’ are selected for evaluation. Because this procedure operates on the decision variables (features) before any fitness evaluation, the efficiency of EAs can be greatly improved. Based on this consideration, we propose a One-class Classification-assisted Selection (OCAS) strategy for EAs. We apply the OCAS strategy to two EAs and study them on three test suites. Our experimental results show that OCAS clearly decreases the number of fitness evaluations.

Section snippets

Introduction and motivation

Evolutionary algorithms (EAs) [1] are a class of heuristic optimization methods [2]. EAs have attracted much attention for solving optimization problems [3] due to their ability to achieve globally optimal or near-optimal results. The main framework of an EA typically includes three main components: (1) initialization, which generates the initial population for an optimization problem; (2) reproduction, which generates new trial solutions from the current population; and (3)
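To make this framework concrete, the following is a minimal Python sketch of the generic EA loop. The Gaussian-mutation reproduction operator, the one-to-one survivor selection, and all parameter values here are illustrative assumptions for the sketch, not the operators used by the specific EAs studied in the paper.

```python
import numpy as np

def evolutionary_algorithm(f, n, pop_size=50, max_fes=10_000, bounds=(-5.0, 5.0)):
    """Generic EA skeleton: (1) initialization, (2) reproduction, (3) selection."""
    lo, hi = bounds
    # (1) Initialization: random population in the feasible region.
    pop = np.random.uniform(lo, hi, size=(pop_size, n))
    fit = np.apply_along_axis(f, 1, pop)
    fes = pop_size
    while fes + pop_size <= max_fes:
        # (2) Reproduction: new trial solutions via Gaussian mutation (illustrative).
        offspring = np.clip(pop + 0.1 * (hi - lo) * np.random.randn(pop_size, n), lo, hi)
        off_fit = np.apply_along_axis(f, 1, offspring)  # one FE per offspring
        fes += pop_size
        # (3) Selection: fitness-driven, keep the better of parent vs. offspring.
        better = off_fit < fit
        pop[better] = offspring[better]
        fit[better] = off_fit[better]
    best = int(fit.argmin())
    return pop[best], fit[best]
```

For example, `evolutionary_algorithm(lambda x: float(np.sum(x**2)), n=10)` minimizes the 10-dimensional sphere function. Note that every offspring consumes one fitness evaluation regardless of whether it survives selection, which is exactly the cost that strategies such as OCAS aim to reduce.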

Related work

To reduce the number of fitness evaluations (FEs) in EAs, especially when each FE is computationally expensive, researchers turn to other methods to assist selection [13]. The most widely used are surrogate (meta-)model-based approaches (SAEAs) [13], [14], [15], [16], [17]. These algorithms build models that approximate the original optimization problems, and the fitness values predicted by the models are then used for selection instead of the true values. In this manner, SAEAs are able to reduce the
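As a rough illustration of this surrogate pre-screening idea (not an implementation of the specific SAEAs cited above), the sketch below fits a scikit-learn `GaussianProcessRegressor` on previously evaluated solutions and uses its predicted fitness to decide which candidates deserve a true, expensive evaluation. The function name and the `n_keep` parameter are our own.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def surrogate_prescreen(archive_x, archive_f, candidates, n_keep):
    """Rank candidates by surrogate-predicted fitness; evaluate only the best."""
    # Fit the surrogate on solutions whose true fitness is already known.
    model = GaussianProcessRegressor(normalize_y=True).fit(archive_x, archive_f)
    predicted = model.predict(candidates)       # cheap, approximate fitness
    keep = np.argsort(predicted)[:n_keep]       # minimization: lowest predicted first
    return candidates[keep]                     # only these receive true FEs
```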

Our proposed one-class classification-assisted selection strategy

We formally define and formulate the optimization problem in this paper as follows:

$$\min_{x \in \Omega} f(x), \tag{1}$$

where $x = (x_1, \ldots, x_n)^T$ is an $n$-dimensional decision variable vector, $\Omega \subseteq \mathbb{R}^n$ is the feasible region of the search space, and $f: \mathbb{R}^n \rightarrow \mathbb{R}$ is the objective function.
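For concreteness (an example of ours, not one of the paper's test problems), the sphere function $f(x) = \sum_{i=1}^{n} x_i^2$ with $\Omega = [-5, 5]^n$ is a simple instance of Eq. (1); its global minimum is $f(x^*) = 0$ at $x^* = (0, \ldots, 0)^T$.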

As introduced in Section 1, a variety of EAs have been proposed to solve the optimization problem in Eq. (1). However, the efficiency of most EAs is still open to question, because a large number of FEs is consumed during the search for optimal results. We,

Experimental study

This section studies the performance of the proposed OCAS-EA approach. We first provide the experimental settings in Section 4.1. In Section 4.2, we compare the performance of the OCAS-assisted EDA/LS and CoDE algorithms (denoted OCAS-EDA/LS and OCAS-CoDE) with the original EDA/LS and CoDE. Section 4.3 studies the sensitivity of the OCAS strategy to the size of the training dataset on OCAS-EDA/LS. Section 4.4 studies the sensitivity of the OCAS strategy to the kernel of the SVM on OCAS-EDA/LS. Section 4.5

Conclusion and future work

We propose a one-class classification-assisted selection (OCAS) strategy for improving the performance of EAs. In OCAS, first, the current population is defined as the positive training dataset. Next, this positive training dataset is used to build a one-class classifier. Then, the built classifier is used to predict the quality of newly generated offspring solutions. Since the predicted unpromising solutions are discarded before the fitness evaluation, the
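The workflow summarized above can be sketched in a few lines. The sketch below uses scikit-learn's `OneClassSVM` as the one-class classifier, since the paper studies SVM kernels; the `nu` value, the RBF kernel choice, and the function name `ocas_filter` are illustrative assumptions rather than the paper's exact settings.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def ocas_filter(population, offspring, nu=0.1):
    """One-class classification-assisted selection (sketch).

    Trains a one-class classifier on the current population (positive
    examples, decision variables only) and keeps only the offspring
    predicted as 'promising', so that unpromising solutions are
    discarded before any fitness evaluation.
    """
    # Step 1: current population = positive training dataset.
    clf = OneClassSVM(kernel="rbf", nu=nu, gamma="scale").fit(population)
    # Step 2: predict the category of each newly generated offspring.
    labels = clf.predict(offspring)   # +1 = similar to population, -1 = outlier
    # Step 3: only predicted-promising offspring proceed to evaluation.
    return offspring[labels == 1]
```

Only the returned subset is passed to the true objective function inside the EA's selection step, which is where the savings in fitness evaluations come from.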

CRediT authorship contribution statement

Jinyuan Zhang: Conceptualization, Data curation, Formal analysis, Investigation, Writing - original draft. Jimmy Xiangji Huang: Conceptualization, Formal analysis, Investigation, Writing - review & editing. Qinmin Vivian Hu: Conceptualization, Formal analysis, Investigation, Writing - review & editing.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgement

We are grateful to the associate editor and all four reviewers for their excellent comments, which greatly helped to improve the quality of the article. This research is supported by the Natural Sciences and Engineering Research Council (NSERC) of Canada, the York Research Chairs (YRC) program, and an ORF-RE (Ontario Research Fund - Research Excellence) award in BRAIN Alliance. All the work was done while the first author was a postdoctoral fellow sponsored by an NSERC Discovery Grant.

References (52)

  • Z. Li et al.

Evolution strategies for continuous optimization: a survey of the state-of-the-art

    Swarm Evol. Comput.

    (2020)
  • W. Feng et al.

Mining network data for intrusion detection through combining SVMs with ant colony networks

    Future Gener. Comput. Syst.

    (2014)
  • B. Thomas et al.

    Handbook of Evolutionary Computation

    (1997)
  • D. Simon

    Evolutionary Optimization Algorithms

    (2013)
  • M. Tabatabaei et al.

    A survey on handling computationally expensive multiobjective optimization problems using surrogates: non-nature inspired methods

    Struct. Multidiscip. Optim.

    (2015)
  • S.S. Khan et al.

    One-class classification: taxonomy of study and review of techniques

    Knowl. Eng. Rev.

    (2013)
  • X. Yao et al.

    Evolutionary programming made faster

    IEEE Trans. Evol. Comput.

    (1999)
  • B. Liu et al.

    A Gaussian process surrogate model assisted evolutionary algorithm for medium scale expensive optimization problems

    IEEE Trans. Evol. Comput.

    (2014)
  • J. Liang et al.

    Problem Definitions and Evaluation Criteria for the CEC 2015 Competition on Learning-based Real-parameter Single Objective Optimization

Technical Report 201411A, Computational Intelligence Laboratory, Zhengzhou University, Zhengzhou, China, and Technical Report

    (2014)
  • A. Zhou et al.

    An estimation of distribution algorithm with cheap and expensive local search methods

    IEEE Trans. Evol. Comput.

    (2015)
  • Y. Wang et al.

    Differential evolution with composite trial vector generation strategies and control parameters

    IEEE Trans. Evol. Comput.

    (2011)
  • R. Cheng et al.

    Model-based evolutionary algorithms: a short survey

    Complex Intell. Syst.

    (2018)
  • Y. Jin

    A comprehensive survey of fitness approximation in evolutionary computation

    Soft Comput.

    (2003)
  • A. Diaz-Manriquez et al.

    A review of surrogate assisted multiobjective evolutionary algorithms

    Comput. Intell. Neurosci.

    (2016)
  • T. Miquelez et al.

    Evolutionary computations based on Bayesian classifiers

Int. J. Appl. Math. Comput. Sci.

    (2004)
  • T. Miquelez et al.

    Combining Bayesian classifiers and estimation of distribution algorithms for optimization in continuous domains

    Connect. Sci.

    (2007)