Abstract
Knowledge extraction is the multidisciplinary process of identifying novel, significant, potentially useful, and consistent information in data. One of the most interesting techniques in the fields of knowledge extraction and machine learning is the self-organizing map (SOM). SOMs can map complex high-dimensional relations onto a reduced lattice while preserving the topological organization of the original data. Evolutionary approaches, in turn, provide an effective alternative for solving complex optimization problems in different application domains. One important drawback in the application of evolutionary methods to real-world problems is their high demand for function evaluations before a satisfactory solution is obtained. During their operation, evolutionary techniques produce new solutions without extracting useful knowledge from the large number of solutions already generated. Using the knowledge acquired during the evolution process could significantly improve their performance by directing the search strategy toward promising regions or increasing their convergence speed. This paper introduces an evolutionary optimization algorithm in which knowledge extracted during its operation is employed to guide its search strategy. In the approach, a SOM is used as a knowledge-extraction technique to identify promising areas through the reduction of the search space. In each generation, the proposed method uses a subset of the complete set of solutions generated so far to train the SOM. Once trained, the neural unit of the SOM lattice that corresponds to the best solution is identified. Then, using the local information of this neural unit, an entire population of candidate solutions is produced. With the use of the extracted knowledge, the new approach improves convergence toward difficult, highly multi-modal optima while requiring a reduced number of function evaluations. The performance of our approach is compared with that of several state-of-the-art optimization techniques on a set of well-known benchmark functions and three real-world engineering problems. The results validate that the introduced method achieves the best balance between accuracy and computational cost among its counterparts.
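To make the workflow described above concrete, the following is a minimal sketch of one SOM-guided generation step on a toy objective. It is not the authors' implementation; the function names (`sphere`, `train_som`, `som_guided_generation`), the SOM grid size, the training schedule, and the Gaussian sampling spread are illustrative assumptions only.

```python
import numpy as np

# Hypothetical sketch of a SOM-guided generation step (not the paper's exact method).

def sphere(x):
    """Toy objective to minimize."""
    return np.sum(x ** 2)

def train_som(samples, grid=(5, 5), epochs=200, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small rectangular SOM on the sampled solutions."""
    rng = np.random.default_rng(seed)
    h, w = grid
    dim = samples.shape[1]
    weights = rng.uniform(samples.min(0), samples.max(0), size=(h, w, dim))
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)          # decaying learning rate
        sigma = sigma0 * np.exp(-t / epochs)    # shrinking neighborhood
        x = samples[rng.integers(len(samples))]
        bmu = np.unravel_index(np.argmin(((weights - x) ** 2).sum(-1)), (h, w))
        dist2 = ((coords - np.array(bmu)) ** 2).sum(-1)
        neigh = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
        weights += lr * neigh * (x - weights)   # pull units toward the sample
    return weights

def som_guided_generation(pop, fitness, pop_size, spread=0.1, seed=1):
    """Build a new population around the SOM unit matching the best solution so far."""
    rng = np.random.default_rng(seed)
    weights = train_som(pop)
    best = pop[np.argmin(fitness)]
    h, w, _ = weights.shape
    bmu = np.unravel_index(np.argmin(((weights - best) ** 2).sum(-1)), (h, w))
    center = weights[bmu]                       # local information of the winning unit
    return center + spread * rng.standard_normal((pop_size, pop.shape[1]))

# Usage: evaluate a random population, then generate the next one around the best unit.
rng = np.random.default_rng(42)
pop = rng.uniform(-5, 5, size=(40, 2))
fit = np.array([sphere(p) for p in pop])
new_pop = som_guided_generation(pop, fit, pop_size=40)
print("best before:", fit.min(), "best after:", min(sphere(p) for p in new_pop))
```

In this sketch the winning unit's weight vector plays the role of the "local information" mentioned in the abstract; the actual algorithm may exploit the lattice neighborhood more extensively.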
Ethics declarations
Conflict of interest
The authors declare that there is no conflict of interest regarding the publication of this paper.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Cuevas, E., Galvez, J. An optimization algorithm guided by a machine learning approach. Int. J. Mach. Learn. & Cyber. 10, 2963–2991 (2019). https://doi.org/10.1007/s13042-018-00915-0