
Oppositional salp swarm algorithm with mutation operator for global optimization and application in training higher order neural networks

  • 1166: Advances of machine learning in data analytics and visual information processing
  • Published in: Multimedia Tools and Applications

Abstract

The effectiveness of any swarm-based metaheuristic optimization algorithm depends on a proper blend of the operators used for exploration and exploitation. An imbalance between these two factors leads to poor performance: the search stagnates in local optima and converges prematurely instead of reaching the global optimum. The Salp Swarm Algorithm (SSA) is a recently proposed optimization technique intended to solve continuous, non-linear and complex real-world optimization problems. For solving such complex real-life problems, the exploration strength of the existing SSA is not adequate. This paper therefore proposes an improved algorithm, termed OBL-MO-SSA, to enhance the performance of the existing SSA. Two techniques, a normally distributed mutation operator and the opposition-based learning concept, are embedded to achieve this. The opposition-based learning concept considers the current and the opposite candidate solutions in the search region simultaneously, so that solutions closer to the optimum can be evaluated during the ongoing evolution process. The mutation operator avoids arbitrary positions in the search region by selecting smaller and larger mutations for balanced movement in the current and opposite directions. The proposed OBL-MO-SSA improves exploration and exploitation within the search region while exhibiting better convergence speed and successfully avoiding stagnation in local optima. To confirm its efficiency, the proposed OBL-MO-SSA is assessed on the IEEE CEC-2017 benchmark problems. Its competence and robustness are characterized using performance metrics, complexity analysis, convergence rate and statistical significance, with Friedman and Holm tests performed to substantiate the statistical results. Furthermore, to address complex problems, the proposed method is used to train a higher order neural network (FLANN) on 10 standard datasets selected from the UCI repository. The simulation results reveal that the developed OBL-MO-SSA can be applied to solve various optimization problems efficiently.
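As a rough illustration of how the two ingredients described above can sit on top of the standard SSA leader/follower updates, the Python sketch below combines opposition-based initialization, a normally distributed mutation with an assumed decaying scale, and an opposition-based acceptance step. The objective function, parameter values and helper names are illustrative assumptions and not the authors' exact formulation.

```python
import numpy as np

def sphere(x):
    """Toy objective (minimization); a stand-in for a CEC-2017 benchmark function."""
    return float(np.sum(x ** 2))

def obl_mo_ssa(obj, dim=10, pop=30, iters=200, lb=-100.0, ub=100.0, seed=0):
    """Illustrative OBL + normal-mutation SSA sketch (assumed details, not the paper's exact scheme)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (pop, dim))                 # salp positions

    # Opposition-based initialization: evaluate each point and its opposite, keep the best half.
    both = np.vstack([X, lb + ub - X])
    X = both[np.argsort([obj(x) for x in both])[:pop]]
    food = X[0].copy()                                  # best-so-far solution (food source)

    for t in range(iters):
        c1 = 2.0 * np.exp(-((4.0 * (t + 1) / iters) ** 2))   # standard SSA control parameter
        sigma = 0.1 * (ub - lb) * (1.0 - t / iters)           # assumed decaying mutation scale
        for i in range(pop):
            if i == 0:                                   # leader explores around the food source
                c2, c3 = rng.random(dim), rng.random(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                X[i] = np.where(c3 < 0.5, food + step, food - step)
            else:                                        # followers move toward their predecessor
                X[i] = 0.5 * (X[i] + X[i - 1])

            # Normally distributed mutation: small and large perturbations drawn from N(0, sigma).
            X[i] = np.clip(X[i] + rng.normal(0.0, sigma, dim), lb, ub)

            # Opposition-based jump: move to the opposite point if it is fitter.
            opp = lb + ub - X[i]
            if obj(opp) < obj(X[i]):
                X[i] = opp
            if obj(X[i]) < obj(food):
                food = X[i].copy()
    return food, obj(food)

best_x, best_f = obl_mo_ssa(sphere)
print(best_f)
```

In an actual experiment, `sphere` would be replaced by the CEC-2017 test functions, and the mutation schedule and population size tuned accordingly.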




Author information


Corresponding author

Correspondence to Nibedan Panda.



About this article


Cite this article

Panda, N., Majhi, S.K. Oppositional salp swarm algorithm with mutation operator for global optimization and application in training higher order neural networks. Multimed Tools Appl 80, 35415–35439 (2021). https://doi.org/10.1007/s11042-020-10304-x

