Abstract
It is argued that the genetic information needed to encode an algorithmic neural processor that tutors an otherwise randomly connected biological neural net is represented by the entropy of the analogous minimal Turing machine. Such a near-minimal machine is constructed that performs the whole range of bivalent propositional logic in n variables. Neural nets computing the same tasks are presented; their informational entropy can be gauged with reference to the analogous Turing machine. It is also shown that nets with one hidden layer can be trained by error back-propagation to perform algorithms solving propositional logic.
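The setting the abstract describes can be illustrated with a minimal sketch (this is not the paper's construction; the `truth_table` helper, the network size of 4 hidden units, and the learning rate are all illustrative assumptions): for n = 2 variables, bivalent propositional logic comprises all 2^(2^n) = 16 truth functions, and a one-hidden-layer net can be trained by error back-propagation to reproduce one of them, here XOR.

```python
import math
import random

# Hypothetical sketch, not the paper's construction: enumerate the whole
# range of bivalent propositional logic in n = 2 variables (all
# 2**(2**n) = 16 truth functions), then train a one-hidden-layer net by
# error back-propagation to reproduce one of them (XOR, function 0b0110).

def truth_table(index, n=2):
    """Return Boolean function number `index` as its list of 2**n output bits."""
    return [(index >> row) & 1 for row in range(2 ** n)]

inputs = [(a, b) for a in (0, 1) for b in (0, 1)]
targets = truth_table(0b0110)        # XOR: [0, 1, 1, 0]

random.seed(1)
H = 4                                 # hidden units (arbitrary choice)
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    """One forward pass: tanh hidden layer, sigmoid output unit."""
    h = [math.tanh(sum(w * xi for w, xi in zip(ws, x)) + b)
         for ws, b in zip(w1, b1)]
    y = 1.0 / (1.0 + math.exp(-(sum(w * hi for w, hi in zip(w2, h)) + b2)))
    return h, y

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in zip(inputs, targets)) / 4

err_before = mse()
lr = 0.5
for epoch in range(20000):
    for x, t in zip(inputs, targets):
        h, y = forward(x)
        d_out = (y - t) * y * (1 - y)                # sigmoid derivative
        for j in range(H):
            d_hid = d_out * w2[j] * (1 - h[j] ** 2)  # tanh derivative
            w2[j] -= lr * d_out * h[j]
            for i in range(2):
                w1[j][i] -= lr * d_hid * x[i]
            b1[j] -= lr * d_hid
        b2 -= lr * d_out

err_after = mse()
print(f"squared error before training: {err_before:.3f}, after: {err_after:.5f}")
```

Indexing each of the 16 functions by the integer whose bits form its output column gives a systematic enumeration of the task family; the same loop could be run over every index to train one net per truth function.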
Additional information
This article was processed by the author using the LaTeX style file pljour2 from Springer-Verlag.
Cite this article
Ritz, B.M., Hofacker, G.L. Informational properties of neural nets performing algorithmic and logical tasks. Biol. Cybern. 74, 549–555 (1996). https://doi.org/10.1007/BF00209426