
Automatic selection of hidden neurons and weights in neural networks for data classification using hybrid particle swarm optimization, multi-verse optimization based on Lévy flight

  • Research Paper
  • Published in: Evolutionary Intelligence

Abstract

Finding optimal values for the number of hidden neurons and the connection weights of a neural network simultaneously is a difficult task. Changing the number of hidden neurons significantly alters the network's structure and makes the training process considerably harder, so it requires special consideration. Particle swarm optimization (PSO) is one of the most widely used metaheuristic algorithms owing to its convergence speed and simplicity of implementation. Multi-verse optimization (MVO) based on Lévy flight is a recent, fast algorithm that avoids premature convergence and achieves a better balance between exploration and exploitation. This paper presents a new training method, PLMVO, which hybridizes particle swarm optimization with Lévy-flight-based multi-verse optimization to optimize the number of hidden neurons and the connection weights of feedforward neural networks (FFNN) simultaneously. The hybrid algorithm searches the solution space more effectively, reducing the risk of becoming trapped in local minima. The proposed algorithm was evaluated in three experimental series. In the first, PLMVO is compared with the MVO and PSO algorithms on a set of 15 benchmark functions for finding the global optimum. In the second, its performance is compared with five evolutionary techniques, standard momentum backpropagation, and backpropagation with an adaptive learning rate, benchmarked on nine biomedical datasets. The results of the comparative study show that PLMVO outperformed the other training methods on most datasets and can serve as an alternative to them. In the third experiment, the proposed PLMVO-MLP is used to detect malicious executable Linux files; the implemented model achieved very promising results, with an accuracy of 1.0 and an average F-measure of 1.0.
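The paper itself provides no code, but the core idea — combining a standard PSO velocity update with Lévy-flight perturbations to escape local minima — can be illustrated with a minimal sketch. This is not the authors' PLMVO implementation: it omits the MVO wormhole mechanism and the hidden-neuron encoding, uses the Mantegna algorithm for Lévy steps, a sphere benchmark, and hypothetical names (`levy_step`, `hybrid_pso_levy`) and parameter values chosen for illustration only.

```python
import math
import random

def levy_step(beta=1.5):
    """Draw one Lévy-flight step via the Mantegna algorithm."""
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0, sigma_u)   # numerator ~ N(0, sigma_u^2)
    v = random.gauss(0, 1)         # denominator ~ N(0, 1)
    return u / abs(v) ** (1 / beta)

def sphere(x):
    """Toy benchmark: global minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def hybrid_pso_levy(fitness, dim=5, swarm=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    random.seed(0)
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pfit = [fitness(p) for p in pos]
    g = min(range(swarm), key=lambda i: pfit[i])
    gbest, gfit = pbest[g][:], pfit[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Standard PSO velocity update (inertia + cognitive + social terms)
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Lévy-flight perturbation: heavy-tailed jumps scaled by the
                # distance to the global best help escape local minima
                pos[i][d] += vel[i][d] + 0.01 * levy_step() * (pos[i][d] - gbest[d])
            f = fitness(pos[i])
            if f < pfit[i]:
                pbest[i], pfit[i] = pos[i][:], f
                if f < gfit:
                    gbest, gfit = pos[i][:], f
    return gbest, gfit

best, best_fit = hybrid_pso_levy(sphere)
print(best_fit)
```

The Lévy term vanishes as a particle approaches the global best, so the long jumps mainly act early in the search, which is the exploration/exploitation trade-off the abstract attributes to the Lévy-flight variant.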




Author information

Correspondence to Rabab Bousmaha.


Cite this article

Bousmaha, R., Hamou, R.M. & Amine, A. Automatic selection of hidden neurons and weights in neural networks for data classification using hybrid particle swarm optimization, multi-verse optimization based on Lévy flight. Evol. Intel. 15, 1695–1714 (2022). https://doi.org/10.1007/s12065-021-00579-w
