Abstract
Artificial neural networks (ANNs) have produced strong results across statistics and computer science applications. The feed-forward neural network (FNN) is the simplest and most widely used neural network architecture, and it is capable of modelling nonlinear relationships. In this paper, we propose determining the weights and biases of feed-forward neural networks using the recently proposed Salp Swarm Algorithm (SSA), a swarm-based metaheuristic inspired by the navigating and foraging behaviour of salp swarms. Performance is evaluated on several benchmark datasets and compared against some well-known metaheuristics.
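The idea of training an FNN with SSA can be sketched as follows: each salp's position is a flat vector holding all network weights and biases, fitness is the training error, the leader salp moves around the best solution found so far (the "food source"), and each follower averages its position with its predecessor in the chain. The sketch below, assuming a single-hidden-layer network with tanh/sigmoid activations and MSE fitness, is illustrative only; all names, shapes, and parameter choices are this sketch's assumptions, not the authors' implementation.

```python
import numpy as np

def fnn_forward(params, X, n_in, n_hidden, n_out):
    """Decode a flat parameter vector and run a one-hidden-layer FNN."""
    i = 0
    W1 = params[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = params[i:i + n_hidden]; i += n_hidden
    W2 = params[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = params[i:i + n_out]
    h = np.tanh(X @ W1 + b1)                       # hidden layer, tanh activation
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid output layer

def ssa_train(X, y, n_hidden=5, n_salps=30, iters=200, lb=-1.0, ub=1.0, seed=0):
    """Optimize FNN weights/biases with a basic Salp Swarm Algorithm loop."""
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], y.shape[1]
    dim = n_in * n_hidden + n_hidden + n_hidden * n_out + n_out
    pos = rng.uniform(lb, ub, (n_salps, dim))      # salp chain positions
    mse = lambda p: np.mean((fnn_forward(p, X, n_in, n_hidden, n_out) - y) ** 2)
    fit = np.array([mse(p) for p in pos])
    best = pos[fit.argmin()].copy()                # food source F
    best_fit = fit.min()
    for l in range(1, iters + 1):
        # Exploration/exploitation coefficient decays over iterations
        # (per Mirjalili et al., 2017).
        c1 = 2.0 * np.exp(-(4.0 * l / iters) ** 2)
        for i in range(n_salps):
            if i == 0:                             # leader: move around the food source
                c2, c3 = rng.random(dim), rng.random(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                pos[i] = np.where(c3 >= 0.5, best + step, best - step)
            else:                                  # follower: average with predecessor
                pos[i] = (pos[i] + pos[i - 1]) / 2.0
        pos = np.clip(pos, lb, ub)                 # keep salps inside the search bounds
        fit = np.array([mse(p) for p in pos])
        if fit.min() < best_fit:                   # update the food source
            best_fit = fit.min()
            best = pos[fit.argmin()].copy()
    return best, best_fit
```

For example, training on the XOR problem (`X = [[0,0],[0,1],[1,0],[1,1]]`, `y = [[0],[1],[1],[0]]`) drives the MSE well below the 0.25 achieved by constantly predicting 0.5. Note that, unlike back-propagation, this treats the network purely as a black-box fitness function, which is what makes population-based metaheuristics such as SSA applicable to its training.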
Copyright information
© 2019 Springer Nature Singapore Pte Ltd.
Cite this paper
Bairathi, D., Gopalani, D. (2019). Salp Swarm Algorithm (SSA) for Training Feed-Forward Neural Networks. In: Bansal, J., Das, K., Nagar, A., Deep, K., Ojha, A. (eds) Soft Computing for Problem Solving. Advances in Intelligent Systems and Computing, vol 816. Springer, Singapore. https://doi.org/10.1007/978-981-13-1592-3_41
Publisher Name: Springer, Singapore
Print ISBN: 978-981-13-1591-6
Online ISBN: 978-981-13-1592-3
eBook Packages: Intelligent Technologies and Robotics