Neurocomputing

Volumes 52–54, June 2003, Pages 541–546

Adaptation using local information for maximizing the global cost

https://doi.org/10.1016/S0925-2312(02)00779-8

Abstract

Recently, the information transmission properties of noisy, parallel summing threshold arrays have been investigated and interpreted in a neural coding context (see Stocks, Phys. Rev. Lett. 84 (2000) 2310; Phys. Rev. E 63 (2001) 1). The mutual information between certain stimuli and the corresponding responses displays a maximum as a function of the noise level. This optimal noise level depends on the number N of neurons within the array, information that is not locally available for single-neuron adaptation. We give an analytic expression for the optimal noise level that depends only on locally available information. The result is based upon an approximation to the mutual information. In the large-N limit both descriptions coincide.

Introduction

A noise-induced maximum in the mutual information is a signature of stochastic resonance (see [3] for a review). Stochastic resonance has been extensively studied in the context of single neurons (see e.g. [4]). Recently, the information transmission properties of noisy, parallel summing threshold arrays have been investigated and interpreted in a neural coding context (see Stocks [5], [6], [7]). The mutual information between certain stimulus and response distributions is maximized as a function of the noise level. This optimal noise level depends on the number N of array elements. Adaptation of single neurons to the optimal noise level would therefore require knowledge of N. But this is information about the global architecture of the system, and it is not available locally at the single cell in any obvious way. In this study we give an analytic expression for the optimal noise level which depends only on information that is, in principle, locally available at any single neuron. This local optimality criterion is based upon an approximation of the mutual information. In the large-N limit both descriptions yield the same optimal noise level. The quality of the approximation is investigated numerically.

This paper is organized as follows. First we introduce the abstract model which describes the input–output relationship of a summing array of parallel threshold elements. In Section 3 the mutual information between a distribution of inputs and a distribution of outputs is introduced. Here we give an approximation to the standard mutual information, based on the work of Brunel and Nadal [1] and the model given in Section 2. Using this approximation we deduce an analytical expression for the optimal noise level, which depends only on information that is locally available at any single neuron. Section 4 contains numerical results in which we demonstrate the quality of our approximation. Section 5, finally, concludes with a brief discussion.


Model

The model [5] consists of a parallel summing array of N threshold devices (Fig. 1). The input to each threshold device is the sum of the signal X and the noise $\eta_i$, which is compared to a constant threshold $\Theta$, yielding the input–output relation
$$y_i=\mathrm{sign}(x+\eta_i-\Theta).$$
The signal X is the same for all threshold devices and is drawn from a Gaussian distribution with variance $\sigma_x$. Each neuron is corrupted by Gaussian noise $\eta_i$ with variance $\sigma_\eta$, $i\in\{1,\dots,N\}$, which is assumed to be mutually independent.
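To make the model concrete, the following is a minimal simulation sketch of the array described above, taking the count of devices above threshold as the summed output; all parameter values (N, the signal and noise spreads, Θ, and the number of stimulus presentations) are illustrative choices, not values from the paper.

```python
# Minimal sketch of the summing threshold array y_i = sign(x + eta_i - Theta).
# All parameter values below are illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

N = 16             # number of threshold devices (assumed)
sigma_x = 1.0      # used here as the standard deviation of the Gaussian signal X
sigma_eta = 0.5    # standard deviation of the i.i.d. Gaussian noise on each device
Theta = 0.0        # common threshold
n_samples = 10000  # number of stimulus presentations

x = rng.normal(0.0, sigma_x, size=n_samples)            # shared signal, one value per presentation
eta = rng.normal(0.0, sigma_eta, size=(n_samples, N))   # independent noise for each device
y = np.sign(x[:, None] + eta - Theta)                   # binary outputs y_i in {-1, +1}
n = np.count_nonzero(y > 0, axis=1)                     # summed output: devices above threshold, in {0, ..., N}

print(n[:10])
```

The integer response n is the quantity whose distribution P(n) enters the mutual information in the next section.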

Approximation of the mutual information

The average mutual information is an information-theoretic measure which quantifies the amount of information the output contains about the input [2]. The mutual information $I_{MI}$ between P(X) and P(Z) is
$$I_{MI}=H(Z)-H(Z|X)=-\sum_{n=0}^{N}P(n)\log_2 P(n)+\int_{-\infty}^{\infty}dx\,P_X(x)\sum_{n=0}^{N}P(n|x)\log_2 P(n|x).$$
Stocks demonstrates [7] that maximum information transfer occurs at an optimal noise level $\sigma_{opt}^{MI}$. To obtain an analytical expression for the optimal noise level, we adapt an approximation introduced by Brunel and Nadal [1].
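As an illustration of how this expression can be evaluated, the sketch below uses the fact that, for i.i.d. Gaussian noise and a common threshold, P(n|x) is binomial with success probability p(x) = P(η > Θ − x); this binomial form and all numerical parameters are assumptions made for illustration and do not reproduce the paper's own computation.

```python
# Numerical sketch of I_MI(sigma) for the summing array; the binomial form of P(n|x)
# and all parameter values are assumptions for illustration.
import numpy as np
from scipy.stats import norm, binom

def mutual_information(N, sigma_eta, sigma_x=1.0, Theta=0.0, n_grid=2001, x_range=8.0):
    # Grid over the Gaussian stimulus X
    x = np.linspace(-x_range * sigma_x, x_range * sigma_x, n_grid)
    px = norm.pdf(x, scale=sigma_x)
    px /= np.trapz(px, x)                          # normalise the density on the grid

    p_on = norm.sf(Theta - x, scale=sigma_eta)     # P(y_i = +1 | x)
    n = np.arange(N + 1)
    pn_given_x = binom.pmf(n[None, :], N, p_on[:, None])   # P(n | x), shape (n_grid, N+1)
    pn = np.trapz(px[:, None] * pn_given_x, x, axis=0)     # P(n), marginalised over x

    def entropy_bits(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    H_Z = entropy_bits(pn)
    H_Z_given_X = np.trapz(px * np.array([entropy_bits(row) for row in pn_given_x]), x)
    return H_Z - H_Z_given_X                       # I_MI = H(Z) - H(Z|X)

# Sweep the noise level sigma = sigma_eta / sigma_x and locate the maximum
sigmas = np.linspace(0.05, 2.0, 40)
I_MI = [mutual_information(N=15, sigma_eta=s) for s in sigmas]
print("optimal noise level (numerical):", sigmas[int(np.argmax(I_MI))])
```

Repeating the sweep for several values of N illustrates how the location of the maximum shifts with the array size, which is the N dependence discussed in the Results section.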

Results

In Fig. 2a the mutual information and $I_F$ are plotted against $\sigma=\sigma_\eta/\sigma_x$ for various N and $\Theta=0$. The curves indicate that, with increasing N, $I_F$ approximates the mutual information. Note that $I_F$ can be negative, because the approximation is good only in the case $F(x)\gg 1$, i.e., when the estimator is sharply peaked around its mean value. Furthermore, Fig. 2a shows that the optimal noise level of the Fisher information, $\sigma_{opt}^F$, is independent of the number of threshold devices, whereas the optimal noise level of the mutual information, $\sigma_{opt}^{MI}$, depends on N.
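For comparison, the sketch below evaluates a Fisher-information-based approximation in the spirit of Brunel and Nadal [1], $I_F = H(X) - \tfrac{1}{2}\langle\log_2(2\pi e/F(x))\rangle$, with the binomial Fisher information $F(x)=N\,p'(x)^2/[p(x)(1-p(x))]$. The paper's exact expression for $I_F$ is not reproduced in this excerpt, so this particular form and all parameter values are assumptions made for illustration only.

```python
# Sketch of a Fisher-information-based approximation I_F in the spirit of Brunel and Nadal [1]:
#   I_F = H(X) - 1/2 * E_x[ log2(2*pi*e / F(x)) ],  F(x) = N * p'(x)^2 / (p(x) * (1 - p(x))).
# This form and all parameter values are assumptions for illustration; they are not the
# paper's exact expression.
import numpy as np
from scipy.stats import norm

def fisher_approximation(N, sigma_eta, sigma_x=1.0, Theta=0.0, n_grid=2001, x_range=8.0):
    x = np.linspace(-x_range * sigma_x, x_range * sigma_x, n_grid)
    px = norm.pdf(x, scale=sigma_x)
    px /= np.trapz(px, x)

    p_on = norm.sf(Theta - x, scale=sigma_eta)      # P(y_i = +1 | x)
    dp = norm.pdf(Theta - x, scale=sigma_eta)       # d p_on / dx
    eps = 1e-300                                    # guard against division by zero in the tails
    F = N * dp**2 / np.clip(p_on * (1.0 - p_on), eps, None)

    H_X = 0.5 * np.log2(2 * np.pi * np.e * sigma_x**2)   # entropy of the Gaussian stimulus
    penalty = 0.5 * np.trapz(px * np.log2(2 * np.pi * np.e / np.clip(F, eps, None)), x)
    return H_X - penalty

sigmas = np.linspace(0.05, 2.0, 40)
I_F = [fisher_approximation(N=15, sigma_eta=s) for s in sigmas]
print("optimal noise level for I_F (numerical):", sigmas[int(np.argmax(I_F))])
```

In this form N enters F(x) only as a multiplicative factor, so it shifts $I_F$ by $\tfrac{1}{2}\log_2 N$ but does not move the location of its maximum over σ, consistent with the observation above that $\sigma_{opt}^F$ is independent of the number of threshold devices.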

Discussion

In this model study we examined stochastic resonance in a summing parallel threshold array. We compared the optimal noise levels of the mutual information $I_{MI}$ between a distribution of inputs and a corresponding distribution of outputs and of $I_F$, which is an approximation to $I_{MI}$. The quality of this approximation increases with the number N of parallel nonlinearities. The optimal noise level $\sigma_{opt}^{MI}$ for $I_{MI}$ depends on N; thus a neuron that should adapt to the optimal noise level needs to have some knowledge of the global architecture of the system.

Acknowledgements

Supported by the Wellcome Trust (061113/Z/00).

References (8)

  • [1] N. Brunel, J.-P. Nadal, Mutual information, Fisher information, and population coding, Neural Comput. (1998).
  • [2] T.M. Cover, J.A. Thomas, Elements of Information Theory, Wiley Series in Telecommunications (1991).
  • [3] L. Gammaitoni et al., Stochastic resonance, Rev. Modern Phys. (1998).
  • [4] D.F. Russell et al., Use of behavioral stochastic resonance by paddle fish for feeding, Nature (1999).

There are more references available in the full text version of this article.
