Abstract
``Neural computing'' is a research field based on viewing the human brain as an information system. This system reads its input continuously via the different senses, encodes data into various biophysical variables such as membrane potentials or neural firing rates, stores information using different kinds of memories (e.g., short-term memory, long-term memory, associative memory), performs operations collectively called ``computation'', and outputs onto various channels, including motor control commands, decisions, thoughts, and feelings. We present a natural model of neural computing that gives rise to hyper-computation. Rigorous mathematical analysis explicates the model's exact computational power and how that power changes as its parameters vary. Our analog neural network attains super-Turing power while respecting computational resource constraints, and thus offers a possible account of the superiority of biological intelligence within the framework of classical computer science. We further propose it as a standard in the field of analog computation, playing a role analogous to that of the universal Turing machine in digital computation. In particular, an analog of the Church-Turing thesis of digital computation is stated, in which the neural network takes the place of the Turing machine.
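The network underlying this line of work (Siegelmann and Sontag) is an analog recurrent neural network whose neurons update through a saturated-linear activation; with rational weights such nets are Turing-equivalent, while real weights take them beyond the Turing limit. The following is a minimal sketch of one synchronous update step of such a network; the two-neuron weights are illustrative placeholders, not taken from the paper.

```python
def sigma(x: float) -> float:
    """Saturated-linear activation: clamp the argument to [0, 1]."""
    return 0.0 if x < 0.0 else 1.0 if x > 1.0 else x

def step(state, inputs, W, W_in, b):
    """One synchronous update:
    x_i(t+1) = sigma( sum_j W[i][j]*x_j(t) + sum_k W_in[i][k]*u_k(t) + b[i] )."""
    return [
        sigma(
            sum(W[i][j] * state[j] for j in range(len(state)))
            + sum(W_in[i][k] * inputs[k] for k in range(len(inputs)))
            + b[i]
        )
        for i in range(len(state))
    ]

# Two-neuron example with illustrative rational weights.
W = [[0.5, 0.25], [0.0, 0.5]]   # recurrent weights
W_in = [[1.0], [0.0]]           # input weights (one input channel)
b = [0.0, 0.25]                 # biases
x = [0.0, 0.0]                  # initial state
for u in [1.0, 0.0, 1.0]:       # feed a short input stream
    x = step(x, [u], W, W_in, b)
print(x)  # prints [1.0, 0.4375]
```

The choice of weight field is what the paper's parameter analysis tracks: restricting the entries of `W`, `W_in`, and `b` to rationals keeps the system within Turing computability, whereas arbitrary reals do not.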
Siegelmann, H.T. Neural and Super-Turing Computing. Minds and Machines 13, 103–114 (2003). https://doi.org/10.1023/A:1021376718708