FMFO: Floating flame moth-flame optimization algorithm for training multi-layer perceptron classifier

Published in Applied Intelligence

Abstract

As one of the most popular artificial neural networks, the multi-layer perceptron (MLP) has been employed to solve classification problems in many applications. The main challenge in applying an MLP is finding, during training, the set of connection weights and biases that minimizes the network's error on the dataset. To address this challenge efficiently, numerous swarm intelligence (SI) algorithms with powerful search capabilities have been adopted for training MLP classifiers. However, these existing algorithms often suffer from local optima stagnation, premature convergence, and inefficient search. In this study, a novel floating flame moth-flame optimization (FMFO) algorithm with strong exploitation and exploration capabilities is proposed as an advantageous option for training MLP classifiers. To verify its performance, the FMFO-based MLP training approach (FMFO-MLP) is evaluated on eleven classification datasets spanning a wide range of problem dimensionalities. In addition, several recently developed, well-known, and state-of-the-art SI algorithms are compared against the proposed FMFO. Experimental results demonstrate that FMFO outperforms the competing algorithms both in approximating the optimal objective function value and in classification accuracy. Moreover, FMFO achieves competitive computational efficiency in the experiments, confirming that it is an efficient optimizer for training MLP classifiers in practical applications.
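The paper's floating-flame mechanism itself is not reproduced here, but the general scheme it builds on, training an MLP by letting a moth-flame optimizer search over a flattened weight-and-bias vector, can be sketched as follows. This is a minimal illustration of standard moth-flame optimization (Mirjalili, 2015) applied to a one-hidden-layer sigmoid MLP under mean-squared error; the network encoding, parameter values, and all function names are illustrative assumptions, not the paper's FMFO.

```python
import numpy as np

def mlp_error(w, X, y, n_in, n_hid):
    """MSE of a one-hidden-layer sigmoid MLP whose weights and biases
    are packed into the flat vector w (illustrative encoding)."""
    k = 0
    W1 = w[k:k + n_in * n_hid].reshape(n_in, n_hid); k += n_in * n_hid
    b1 = w[k:k + n_hid]; k += n_hid
    W2 = w[k:k + n_hid]; k += n_hid
    b2 = w[k]
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))    # hidden activations
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # single output neuron
    return np.mean((out - y) ** 2)

def mfo_train_mlp(X, y, n_hid=5, n_moths=30, max_iter=200, seed=0):
    """Plain MFO search over MLP weights; returns the best weight vector."""
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    dim = n_in * n_hid + n_hid + n_hid + 1      # all weights and biases
    moths = rng.uniform(-1.0, 1.0, (n_moths, dim))
    for it in range(max_iter):
        fit = np.array([mlp_error(m, X, y, n_in, n_hid) for m in moths])
        # simplification: flames are the current population sorted by fitness
        # (canonical MFO sorts the union of previous flames and current moths)
        flames = moths[np.argsort(fit)].copy()
        # flame count shrinks linearly from n_moths to 1 (standard schedule)
        n_flames = max(1, round(n_moths - it * (n_moths - 1) / max_iter))
        a = -1.0 - it / max_iter                # a decreases from -1 to -2
        for i in range(n_moths):
            f = flames[min(i, n_flames - 1)]    # surplus moths share last flame
            d = np.abs(f - moths[i])            # distance from moth to flame
            t = (a - 1.0) * rng.random(dim) + 1.0   # t drawn from [a, 1]
            # logarithmic spiral around the flame (spiral constant b = 1)
            moths[i] = d * np.exp(t) * np.cos(2.0 * np.pi * t) + f
    fit = np.array([mlp_error(m, X, y, n_in, n_hid) for m in moths])
    return moths[np.argmin(fit)]
```

A weight vector returned by mfo_train_mlp would be unpacked the same way as in mlp_error and thresholded at 0.5 for binary classification; the FMFO proposed in the paper modifies this baseline loop with its floating flame mechanism.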



Acknowledgements

This study was funded by the Guangzhou Municipal Science and Technology Bureau of China (Research Grant no. 202002030133), the Guangzhou Municipal Education Bureau of China (Research Grant no. 201831785), the Department of Education of Guangdong Province of China (Research Grant no. 2019GKTSCX069), and the Guangzhou Panyu Polytechnic (Research Grant no. 2011Y05PY).

Author information


Corresponding author

Correspondence to Zhenlun Yang.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Yang, Z. FMFO: Floating flame moth-flame optimization algorithm for training multi-layer perceptron classifier. Appl Intell 53, 251–271 (2023). https://doi.org/10.1007/s10489-022-03484-6

