Abstract
In neural networks, simultaneously finding optimal values for the number of hidden neurons and the connection weights is a challenging task. This is because altering the number of hidden neurons substantially changes the structure of the network and complicates the training process: the number of variables to optimize grows in proportion to the number of hidden nodes. As one of the seminal attempts to address these challenges, a hybrid encoding scheme is first proposed. A set of recent, well-regarded stochastic population-based algorithms is then employed to optimize both the number of hidden neurons and the connection weights of a single-hidden-layer feedforward neural network (FFNN). In the experiments, twenty-three standard classification datasets are used to benchmark the proposed technique qualitatively and quantitatively. The results show that the hybrid encoding scheme allows the optimization algorithms to conveniently find optimal values for both the number of hidden nodes and the connection weights, and that the recently proposed grey wolf optimizer (GWO) outperforms the other algorithms.
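To illustrate the idea behind a hybrid encoding, the sketch below shows one plausible way a single candidate vector could encode both the number of hidden neurons and the weights of a single-hidden-layer FFNN. This is a minimal, hypothetical reconstruction for illustration only, not the paper's actual scheme: the first gene is rounded to an integer neuron count, and the remaining genes hold weights sized for the largest allowed network, of which only the slice needed for the active neurons is decoded. The function names (`decode`, `forward`) and the fixed sigmoid activation are assumptions.

```python
import numpy as np

def decode(candidate, n_in, max_hidden):
    """Split one real-valued candidate vector into (h, W1, W2).

    Assumed layout: candidate[0] encodes the hidden-neuron count;
    the next max_hidden*(n_in+1) genes are input->hidden weights
    (with biases); the final max_hidden+1 genes are hidden->output
    weights (with bias). Only the first h rows/entries are active.
    """
    h = int(np.clip(round(candidate[0]), 1, max_hidden))
    genes = np.asarray(candidate[1:], dtype=float)
    # Active input->hidden weights: h rows of (n_in + 1) values each.
    w1 = genes[:h * (n_in + 1)].reshape(h, n_in + 1)
    # Active hidden->output weights: h values plus one output bias.
    start = max_hidden * (n_in + 1)
    w2 = genes[start:start + h + 1]
    return h, w1, w2

def forward(x, w1, w2):
    """Forward pass of the decoded network with sigmoid activations."""
    x = np.append(x, 1.0)                        # append input bias
    hidden = 1.0 / (1.0 + np.exp(-(w1 @ x)))     # sigmoid hidden layer
    hidden = np.append(hidden, 1.0)              # append hidden bias
    return 1.0 / (1.0 + np.exp(-(w2 @ hidden)))  # sigmoid output
```

Under this layout, any population-based optimizer (GWO, PSO, GA, ...) can treat the whole vector as a flat continuous search space, so changing the neuron count never changes the vector's length, which is the convenience the abstract attributes to the hybrid encoding.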

Faris, H., Mirjalili, S. & Aljarah, I. Automatic selection of hidden neurons and weights in neural networks using grey wolf optimizer based on a hybrid encoding scheme. Int. J. Mach. Learn. & Cyber. 10, 2901–2920 (2019). https://doi.org/10.1007/s13042-018-00913-2