WOLIF: An efficiently tuned classifier that learns to classify non-linear temporal patterns without hidden layers

Abstract

In this paper we present WOLIF, a computationally efficient and biologically plausible classifier that uses the Grey Wolf Optimizer (GWO) to tune an error function derived from the Leaky Integrate-and-Fire (LIF) spiking neuron. Unlike a traditional artificial neuron, a spiking neuron can classify non-linear temporal patterns without hidden layer(s), which makes a Spiking Neural Network (SNN) computationally efficient: there is no additional cost from hidden layer(s), and the network remains biologically plausible and energy efficient. Since supervised learning rules for SNNs are still in their infancy, we introduce the WOLIF classifier and its GWO-based supervised learning rule. WOLIF uses a single LIF neuron, and therefore fewer network parameters, together with homo-synaptic static long-term synaptic weights (both excitatory and inhibitory). WOLIF also reduces the total simulation time, which further improves computational efficiency. Benchmarked on seven datasets drawn from the UCI machine learning repository, it achieves better results than state-of-the-art methods in terms of both accuracy and computational cost.
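To make the idea concrete, the sketch below shows, in Python, a single LIF neuron driven by weighted pre-synaptic spike times and a simplified Grey Wolf Optimizer that tunes the synaptic weights to minimise a first-spike-time error. This is a minimal illustration under assumed settings, not the authors' implementation: the function names (lif_first_spike, spike_time_error, gwo_optimize), the neuron constants, the toy patterns, and the error definition are all illustrative assumptions.

```python
import numpy as np

def lif_first_spike(weights, input_spike_times, t_max=100.0, dt=1.0,
                    tau_m=10.0, v_thresh=1.0, v_rest=0.0):
    """Return the first output-spike time of one LIF neuron driven by
    weighted pre-synaptic spikes (t_max if the neuron never fires)."""
    v = v_rest
    for t in np.arange(0.0, t_max, dt):
        v += dt * (-(v - v_rest) / tau_m)                        # leaky decay
        v += np.sum(weights[np.isclose(input_spike_times, t)])   # arriving spikes
        if v >= v_thresh:
            return t
    return t_max

def spike_time_error(weights, patterns, target_times):
    """Toy error: squared gap between actual and desired first-spike times."""
    return sum((lif_first_spike(weights, spikes) - target) ** 2
               for spikes, target in zip(patterns, target_times))

def gwo_optimize(fitness, dim, n_wolves=10, n_iter=50, lb=-1.0, ub=1.0, seed=0):
    """Simplified Grey Wolf Optimizer: wolves move toward the three best
    solutions (alpha, beta, delta) with a linearly decreasing coefficient."""
    rng = np.random.default_rng(seed)
    wolves = rng.uniform(lb, ub, size=(n_wolves, dim))
    scores = np.array([fitness(w) for w in wolves])
    for it in range(n_iter):
        alpha, beta, delta = wolves[np.argsort(scores)[:3]]
        a = 2.0 - 2.0 * it / n_iter
        for i in range(n_wolves):
            candidate = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2.0 * a * r1 - a, 2.0 * r2
                candidate += leader - A * np.abs(C * leader - wolves[i])
            wolves[i] = np.clip(candidate / 3.0, lb, ub)
            scores[i] = fitness(wolves[i])
    best = int(np.argmin(scores))
    return wolves[best], scores[best]

# Two toy input patterns (pre-synaptic spike times) with class-dependent
# desired output-spike times; both are made-up values for illustration only.
patterns = [np.array([2.0, 6.0, 9.0]), np.array([30.0, 35.0, 40.0])]
targets = [15.0, 60.0]
best_w, best_err = gwo_optimize(
    lambda w: spike_time_error(w, patterns, targets), dim=3)
print("tuned weights:", best_w, "error:", best_err)
```

In this toy setup the error is the squared gap between the neuron's first output spike time and a class-specific target time, echoing the temporal-coding view of classification; the actual WOLIF error function, spike encoding scheme, and benchmark protocol are described in the full paper.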

Author information

Corresponding author

Correspondence to Irshed Hussain.

Ethics declarations

Conflict of interest

The first and second authors declare no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Hussain, I., Thounaojam, D.M. WOLIF: An efficiently tuned classifier that learns to classify non-linear temporal patterns without hidden layers. Appl Intell 51, 2173–2187 (2021). https://doi.org/10.1007/s10489-020-01934-7
