Abstract
Spiking networks are third-generation artificial neural networks with a higher level of biological realism. This realism comes at the cost of extra computation, which, together with their complexity, makes them impractical for general machine-learning applications. We propose that for some problems, spiking networks can actually be more efficient than second-generation networks. This paper presents several enhancements to the supervised learning algorithm SpikeProp, including reduced precision, fewer subconnections, a lookup table and event-driven computation. The CPU time required by our new algorithm SpikeProp+ was measured and compared to that of multilayer perceptron backpropagation. We found SpikeProp+ to use 20 times less CPU time than SpikeProp when learning a classifier, though it remains ten times slower than the perceptron network. Our new networks are not yet optimal, however, and several avenues exist for achieving further gains. Our results suggest it may be possible to build highly efficient neural networks in this way.
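Two of the enhancements named above, the lookup table and event-driven computation, can be illustrated with a minimal sketch. The spike-response kernel ε(t) = (t/τ)·exp(1 − t/τ) is the standard SpikeProp choice (Bohte et al.); everything else here (the time constant, table resolution, step size, and the `first_spike_time` helper) is a hypothetical illustration, not the paper's actual SpikeProp+ implementation:

```python
import numpy as np

TAU = 7.0          # membrane time constant (ms); illustrative value
TABLE_STEP = 0.1   # lookup-table resolution (ms); illustrative value
TABLE_LEN = 512    # table covers 0 .. TABLE_LEN * TABLE_STEP ms

# Precompute the spike-response kernel eps(t) = (t/tau) * exp(1 - t/tau)
# once, so the simulation never calls exp() in the inner loop.
_ts = np.arange(TABLE_LEN) * TABLE_STEP
EPS_TABLE = (_ts / TAU) * np.exp(1.0 - _ts / TAU)

def eps(t):
    """Table lookup for the spike-response function; 0 outside table range."""
    if t < 0.0:
        return 0.0
    i = int(t / TABLE_STEP)
    return EPS_TABLE[i] if i < TABLE_LEN else 0.0

def first_spike_time(in_spikes, weights, threshold=1.0, t_max=50.0, dt=0.5):
    """Evaluate one spiking neuron in an event-driven style.

    in_spikes: presynaptic firing times (ms); weights: matching synaptic
    weights. Returns the neuron's first firing time, or None if it stays
    silent. The potential is only evaluated from the earliest input spike
    onward, skipping the silent interval entirely: that skip, plus the
    kernel table above, is where the computational saving comes from.
    """
    if not in_spikes:
        return None
    t = min(in_spikes)  # nothing can happen before the first input event
    while t <= t_max:
        u = sum(w * eps(t - ti) for ti, w in zip(in_spikes, weights))
        if u >= threshold:
            return t
        t += dt
    return None
```

For example, a single input spike at t = 0 through a weight of 2.0 pushes the potential past a threshold of 1.0 shortly after the kernel starts rising, whereas a weak weight never fires and the loop terminates at `t_max`.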
Thiruvarudchelvan, V., Moore, W., Antolovich, M. (2013). Improving the Efficiency of Spiking Network Learning. In: Lee, M., Hirose, A., Hou, ZG., Kil, R.M. (eds) Neural Information Processing. ICONIP 2013. Lecture Notes in Computer Science, vol 8227. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-42042-9_22