Abstract
Human information processing relies mainly on billions of neurons that form a complex neural network, in which information is transmitted in the form of neural spikes. In this paper, we propose a spiking neural network (SNN), named MD-SNN, with three key features: (1) receptive fields are used to encode images into spike trains; (2) each neuron randomly selects only a subset of the incoming spikes as its inputs, approximating the absolute refractory period of the neuron; (3) decisions are made by groups of neurons. We test MD-SNN on the MNIST handwritten-digit data set, and the results demonstrate that: (1) the size of the receptive field significantly influences classification results; (2) when the neuronal refractory period is taken into account in the SNN model, increasing the number of neurons in the learning layer greatly reduces the training time, effectively lowers the probability of over-fitting, and improves the accuracy by 8.77%; (3) compared with other SNN methods, MD-SNN achieves better classification accuracy; compared with a convolutional neural network (CNN), MD-SNN maintains flip and rotation invariance (the accuracy remains at 90.44% on the test set) and is better suited to small-sample learning (the accuracy reaches 80.15% with 1000 training samples, 7.8 times that of the CNN).
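To make the three features concrete, the following minimal Python/NumPy sketch illustrates one plausible reading of them: a difference-of-Gaussians kernel as a centre-surround receptive field with latency (time-to-first-spike) coding, random subsampling of afferent spikes per neuron as a stand-in for the absolute refractory period, and a group-of-neurons vote for the final decision. The kernel parameters, keep_ratio, and group_size here are illustrative assumptions, not the authors' exact MD-SNN design.

import numpy as np

def dog_kernel(size=5, sigma_c=1.0, sigma_s=2.0):
    # Difference-of-Gaussians kernel as a simple centre-surround receptive field
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    centre = np.exp(-(xx**2 + yy**2) / (2 * sigma_c**2)) / (2 * np.pi * sigma_c**2)
    surround = np.exp(-(xx**2 + yy**2) / (2 * sigma_s**2)) / (2 * np.pi * sigma_s**2)
    return centre - surround

def encode_image(img, kernel, t_max=100.0):
    # Latency coding: stronger receptive-field responses fire earlier spikes
    h, w = img.shape
    k = kernel.shape[0]
    pad = k // 2
    padded = np.pad(img, pad, mode='constant')
    resp = np.zeros_like(img, dtype=float)
    for i in range(h):
        for j in range(w):
            resp[i, j] = np.sum(padded[i:i + k, j:j + k] * kernel)
    resp = np.clip(resp, 0, None)
    if resp.max() > 0:
        resp /= resp.max()
    # earlier spike time for stronger response; silent (inf) where response is zero
    return np.where(resp > 0, t_max * (1.0 - resp), np.inf).ravel()

def subsample_spikes(spike_times, keep_ratio=0.7, rng=None):
    # Randomly drop a fraction of afferent spikes for one neuron, roughly
    # mimicking input spikes lost during the absolute refractory period
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(spike_times.shape) < keep_ratio
    return np.where(mask, spike_times, np.inf)

def group_decision(potentials, n_classes=10, group_size=5):
    # Each class is read out by a group of neurons; the class whose group
    # has the largest summed response wins
    grouped = potentials.reshape(n_classes, group_size)
    return int(np.argmax(grouped.sum(axis=1)))

# Toy usage on a random "image"; the output potentials stand in for a
# trained tempotron-style learning layer, which this sketch does not model.
rng = np.random.default_rng(0)
img = rng.random((28, 28))
spikes = encode_image(img, dog_kernel())
afferents = subsample_spikes(spikes, keep_ratio=0.7, rng=rng)
potentials = rng.random(50)
print(group_decision(potentials))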
Additional information
Project supported by the National Natural Science Foundation of China (Nos. 61773312, 61773307, and L1522023), the China Postdoctoral Science Foundation (No. 2016M590949), and the National Basic Research Program (973) of China (No. 2015CB351703)
Cite this article
Ma, Yq., Wang, Zr., Yu, Sy. et al. A novel spiking neural network of receptive field encoding with groups of neurons decision. Frontiers Inf Technol Electronic Eng 19, 139–150 (2018). https://doi.org/10.1631/FITEE.1700714