Abstract
Biological neural networks (BNNs) have inspired the creation of artificial neural networks (ANNs) [19]. One property of BNNs is computational robustness, but this property is often overlooked in computer science because ANNs are usually virtualizations executed on a physical machine that itself lacks computational robustness. However, it was recently proposed that computational robustness could be a key feature driving the selection of the computational model in the evolution of animals [20]. Until then, only energetic cost and processing time had been considered as features driving the evolution of the nervous system. This new standpoint leads us to ask whether, through natural selection, computational robustness could have driven the evolution not only of the computational model but also of other nervous system traits in animals. Because an important feature of an animal’s nervous system is its neural code, we tested the relationship between the computational properties of feed-forward neural networks and their neural codes through in silico experiments. We obtained two main results: there is a relationship between the number of epochs needed to train a feed-forward neural network using back-propagation and the network’s neural code, and there is a relationship between a feed-forward neural network’s computational robustness and its neural code. The first result is important to ANNs and the second to BNNs.
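The kind of in silico experiment described above can be illustrated with a minimal sketch. This is not the paper's actual setup; the task (3-bit parity), the network sizes, and the two output codes (a local one-hot code versus a dense single-unit code) are all assumptions chosen for brevity. It trains a one-hidden-layer feed-forward network with back-propagation under each output code and counts the epochs needed to reach a target loss:

```python
import numpy as np

# Hypothetical sketch (not the paper's experimental setup): train a tiny
# feed-forward network with back-propagation to compute the parity of a
# 3-bit input, with the target encoded under two different "neural codes":
# a local one-hot code (one unit per class) and a dense single-unit code.
# We count the epochs needed to reach a target loss under each code.

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def epochs_to_converge(Y, hidden=8, lr=1.0, target=0.01, max_epochs=5000):
    """Train a 1-hidden-layer MLP on all 3-bit inputs; return epochs to reach target MSE."""
    X = np.array([[i >> 2 & 1, i >> 1 & 1, i & 1] for i in range(8)], float)
    W1 = rng.normal(0.0, 0.5, (3, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, Y.shape[1])); b2 = np.zeros(Y.shape[1])
    for epoch in range(1, max_epochs + 1):
        H = sigmoid(X @ W1 + b1)               # forward pass
        O = sigmoid(H @ W2 + b2)
        err = O - Y
        if np.mean(err ** 2) < target:
            return epoch
        dO = err * O * (1 - O)                 # backward pass (MSE + sigmoid)
        dH = (dO @ W2.T) * H * (1 - H)
        W2 -= lr * (H.T @ dO); b2 -= lr * dO.sum(axis=0)
        W1 -= lr * (X.T @ dH); b1 -= lr * dH.sum(axis=0)
    return max_epochs

parity = np.array([bin(i).count("1") % 2 for i in range(8)])
one_hot = np.eye(2)[parity]                    # local code: one unit per class
dense = parity.reshape(-1, 1).astype(float)    # dense code: single output unit

epochs_one_hot = epochs_to_converge(one_hot)
epochs_dense = epochs_to_converge(dense)
print("one-hot code:", epochs_one_hot, "epochs; dense code:", epochs_dense, "epochs")
```

Comparing the two epoch counts across many random initializations is one simple way to probe the relationship between the output code and training time that the abstract reports.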
Acknowledgments
This research was supported by Universidad Isabel I. I would like to thank Lori-Ann Tuscan for proofreading this paper.
References
Aggarwal, C.: Neural Networks and Deep Learning: A Textbook. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-94463-0
Albert, R., et al.: Error and attack tolerance of complex networks. Nature 406, 378–382 (2000)
Antonopoulos, C., et al.: Evaluating performance of neural codes in model neural communication networks. Neural Netw. 109, 90–102 (2019)
Beaulieu-Laroche, L., et al.: Enhanced dendritic compartmentalization in human cortical neurons. Cell 175(3), 643–651 (2018)
Bullmore, E., Sporns, O.: The economy of brain network organization. Nat. Rev. Neurosci. 13, 336–349 (2012)
Cherniak, C.: Component placement optimization in the brain. J. Neurosci. 14(4), 2418–2427 (1994)
Ghosh, A., Pal, N., Pal, S.: Modeling of component failure in neural networks for robustness evaluation: an application to object extraction. IEEE Trans. Neural Netw. 6(3), 648–656 (1995)
Ghosh, A., Tanaka, H.: On making neural network based learning systems robust. IETE J. Res. 44(4–5), 219–225 (1998)
Guerguiev, J., et al.: Towards deep learning with segregated dendrites. eLife 6, e22901 (2017)
Gulyás, A., et al.: Navigable networks as Nash equilibria of navigation games. Nat. Commun. 6(7651), 1–10 (2015)
Kalampokis, A., et al.: Robustness in biological neural networks. Physica A: Stat. Mech. Appl. 317(3–4), 581–590 (2003)
Kazantsev, V.B., et al.: Self-referential phase reset based on inferior olive oscillator dynamics. Proc. Nat. Acad. Sci. 101(52), 18183–18188 (2004)
Kong, Q., et al.: Efficient coding matters in the organization of the early visual system. Neural Netw. 105, 218–226 (2018)
Laughlin, S.B., Sejnowski, T.J.: Communication in neural networks. Science 301(5641), 1870–1874 (2003)
Lianchun, Y., Yuguo, Y.: Energy-efficient neural information processing in individual neurons and neuronal networks. J. Neurosci. Res. 95(11), 2253–2266 (2017)
Lucal, H.M.: Arithmetic operations for digital computers using a modified reflected binary code. IRE Trans. Electron. Comput. EC-8(4), 449–458 (1959)
Makarenko, V., Llinás, R.: Experimentally determined chaotic phase synchronization in a neuronal system. Proc. Nat. Acad. Sci. 95(26), 15747–15752 (1998)
Manin, Y.I.: Error-correcting codes and neural networks. Sel. Math. 24(1), 521–530 (2018)
McCulloch, W., Pitts, W.: A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biophys. 5, 115–133 (1943)
Miguel-Tomé, S.: The influence of computational traits on the natural selection of the nervous system. Natural Comput. 17(2), 403–425 (2018)
Moreno, H., et al.: Synaptic transmission block by presynaptic injection of oligomeric amyloid beta. Proc. Nat. Acad. Sci. 106(14), 5901–5906 (2009)
Pryluk, R., et al.: A tradeoff in the neural code across regions and species. Cell 176(3), 597–609.e18 (2019)
Rumelhart, D.E., Hinton, G.E., Williams, R.J.: Learning internal representations by error propagation. In: Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol. 1, pp. 318–362. MIT Press (1986)
Werbos, P.: Beyond regression: new tools for prediction and analysis in the behavioral sciences. Ph.D. thesis, Harvard University (1974)
Yeung, R.: Information Theory and Network Coding. Springer, Boston (2008). https://doi.org/10.1007/978-0-387-79234-7
Yuste, R.: From the neuron doctrine to neural networks. Nat. Rev. Neurosci. 16, 487–497 (2015)
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
Miguel-Tomé, S. (2019). An Experimental Study on the Relationships Among Neural Codes and the Computational Properties of Neural Networks. In: Ferrández Vicente, J., Álvarez-Sánchez, J., de la Paz López, F., Toledo Moreo, J., Adeli, H. (eds) From Bioinspired Systems and Biomedical Applications to Machine Learning. IWINAC 2019. Lecture Notes in Computer Science(), vol 11487. Springer, Cham. https://doi.org/10.1007/978-3-030-19651-6_5
Print ISBN: 978-3-030-19650-9
Online ISBN: 978-3-030-19651-6