
AutoRWN: automatic construction and training of random weight networks using competitive swarm of agents

Original Article · Published in Neural Computing and Applications

Abstract

Random Weight Networks (RWNs) have been extensively used in many applications over the last decade because they offer strong features such as fast learning and good generalization performance. Most traditional training techniques for Random Weight Networks select the connection weights and hidden biases randomly and thus suffer from local optima stagnation and degraded convergence. The literature shows that stochastic population-based optimization techniques are a well-regarded and reliable alternative for Random Weight Network optimization because of their high local optima avoidance and flexibility. In addition, many practitioners and non-expert users find it difficult to set the other parameters of the network, such as the number of hidden neurons, the activation function, and the regularization factor. In this paper, an approach for training Random Weight Networks is proposed based on a recent variant of particle swarm optimization called competitive swarm optimization. Unlike most Random Weight Network training techniques, which optimize only the input weights and hidden biases, the proposed approach automatically and simultaneously tunes the weights, the biases, the number of hidden neurons, the regularization factor, and the activation function embedded in the network. The goal is to help users effectively identify a proper structure and hyperparameter values for their applications while obtaining reasonable prediction results. Twenty benchmark classification datasets are used to compare the proposed approach with different types of basic and hybrid Random Weight Network-based models. The experimental results on these benchmarks show that reasonable classification results can be obtained by automatically tuning the hyperparameters with the proposed approach.
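To make the described approach concrete, below is a minimal sketch (in Python with NumPy) of the two ingredients the abstract combines: a random weight network whose output weights are solved in closed form with ridge regularization, and a competitive swarm optimizer (in the style of Cheng and Jin, 2015) whose pairwise winner/loser update evolves one flat vector encoding the input weights, hidden biases, number of hidden neurons, activation function, and regularization factor. This is not the authors' implementation: the function names (decode, rwn_fitness, cso), the encoding and decoding scheme, the bounds, and the fixed activation pool are illustrative assumptions.

```python
# Illustrative sketch only (not the paper's code): a competitive swarm
# optimizer evolving a flat vector that encodes an RWN's input weights,
# hidden biases, number of hidden neurons, activation, and ridge factor C.
import numpy as np

# Assumed activation pool; the paper searches over embedded activations.
ACTIVATIONS = {0: np.tanh,
               1: lambda z: 1.0 / (1.0 + np.exp(-z)),   # sigmoid
               2: lambda z: np.maximum(z, 0.0)}         # ReLU

def decode(ind, d, max_hidden):
    """Split one flat individual into RWN weights and hyperparameters."""
    n_w = max_hidden * (d + 1)                  # weight + bias block
    Wb = ind[:n_w].reshape(max_hidden, d + 1)
    L = 1 + int(abs(ind[n_w]) * max_hidden) % max_hidden   # hidden neurons
    act = ACTIVATIONS[int(abs(ind[n_w + 1]) * 3) % 3]      # activation choice
    C = 10.0 ** np.clip(ind[n_w + 2], -5, 5)               # regularization
    return Wb[:L, :d], Wb[:L, d], act, C

def rwn_fitness(ind, X, Y, max_hidden):
    """Closed-form ridge solution for the output weights, then error rate."""
    W, b, act, C = decode(ind, X.shape[1], max_hidden)
    H = act(X @ W.T + b)                        # random hidden-layer output
    beta = np.linalg.solve(H.T @ H + np.eye(H.shape[1]) / C, H.T @ Y)
    return np.mean((H @ beta).argmax(1) != Y.argmax(1))

def cso(X, Y, max_hidden=50, swarm=40, iters=200, phi=0.1, seed=0):
    """Competitive swarm optimizer: the loser of each random pairing learns
    from the winner and the swarm mean; the winner passes through unchanged."""
    rng = np.random.default_rng(seed)
    dim = max_hidden * (X.shape[1] + 1) + 3
    pos = rng.uniform(-1, 1, (swarm, dim))
    vel = np.zeros_like(pos)
    fit = np.array([rwn_fitness(p, X, Y, max_hidden) for p in pos])
    for _ in range(iters):
        order = rng.permutation(swarm)
        mean_pos = pos.mean(axis=0)
        for i, j in zip(order[::2], order[1::2]):
            w, l = (i, j) if fit[i] <= fit[j] else (j, i)
            r1, r2, r3 = rng.random((3, dim))
            vel[l] = r1 * vel[l] + r2 * (pos[w] - pos[l]) \
                     + phi * r3 * (mean_pos - pos[l])
            pos[l] += vel[l]                    # only the loser moves
            fit[l] = rwn_fitness(pos[l], X, Y, max_hidden)
    best = fit.argmin()
    return pos[best], fit[best]
```

Here X is the feature matrix and Y a one-hot label matrix; calling cso(X, Y) returns the best-found encoded network and its training error rate. In practice the fitness would be evaluated on a validation split rather than the training data, otherwise the joint hyperparameter search can overfit.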



Notes

  1. Interested readers can refer to http://unboxresearch.com/articles/randnn.html for a visual exploration of RWNs.


Author information


Corresponding author

Correspondence to Hossam Faris.

Ethics declarations

Conflict of interest

The authors declare that there is no conflict of interest regarding the publication of this paper.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Eshtay, M., Faris, H., Heidari, A.A. et al. AutoRWN: automatic construction and training of random weight networks using competitive swarm of agents. Neural Comput & Applic 33, 5507–5524 (2021). https://doi.org/10.1007/s00521-020-05329-0

