
Neurocomputing

Volumes 26–27, June 1999, Pages 199-206

Dynamic synapse: Harnessing the computing power of synaptic dynamics

https://doi.org/10.1016/S0925-2312(99)00063-6

Abstract

A major unresolved issue in neuroscience is the emergence of functional capability from underlying molecular and cellular mechanisms. The concept of a dynamic synapse is extended to provide a formalism for incorporating synaptic mechanisms into a general scheme of neural information processing: a synapse is composed of two distinct functional units, presynaptic terminals for transforming a sequence of action potentials into multiple sequences of discrete release events, and postsynaptic components for combining such synaptic signals. The complex interaction of various cellular and molecular processes in synapses can be concisely expressed and interpreted in these two synaptic terms. Learning involves modifying the synaptic dynamics such that each axon terminal performs the proper transformation function.

Introduction

The brain derives its functional capacities from a vast number of underlying cellular and molecular mechanisms and their complex interactions. For instance, the existence of various dynamic processes in the axon terminal, including paired-pulse facilitation, augmentation, post-tetanic potentiation, depression, etc., has been well documented [4], [11], [21]. The process of neurotransmitter release is mediated by a complex molecular machinery [15], [19]. Furthermore, these molecular mechanisms are themselves dynamic, under constant regulation and modulation [9], [18]. Given such immense complexity, how can we elucidate the role of synaptic dynamics in neural function? An even more tantalizing question is how such regulation or modulation (e.g., the phosphorylation of, say, synaptotagmin, a protein in the axon terminal) can be harnessed for learning and memory. Due to the complexity of the issue, the functional role of synaptic dynamics in neural information processing remains largely unexplored, except for a few studies (e.g. [16], [20]). In this article, we propose a formalism based on the concept of a dynamic synapse [6] for incorporating synaptic mechanisms into the general scheme of information processing in the brain.

The dynamic synapse proposed by Liaw and Berger (1996) embodies three major concepts. First, an axon terminal transforms a sequence of action potentials into another sequence of discrete release events. This is a functional consequence of the existence of multiple excitatory and inhibitory mechanisms with different time constants in the terminal. Moreover, once neurotransmitter is released, its concentration in the synaptic cleft rises and falls very rapidly, with a decay time constant of 1–2 ms [2]. Thus, a release event can be regarded as a “spike”, in much the same way as an action potential is. Second, an array of different sequences of release events can be generated by the axon terminals of a single neuron. This is a result of variations in the synaptic mechanisms across different presynaptic terminals [5], [17]. In other words, each axon terminal acts as an independent communication channel, such that information is coded in the spatio-temporal patterns of release events instead of in a single sequence of action potentials. Third, during learning, presynaptic mechanisms are modified such that each axon terminal performs the proper pattern transformation function, while changes in postsynaptic mechanisms determine how the synaptic signals are to be combined. The computational capability of synaptic dynamics has been demonstrated by achieving speaker-independent word recognition from raw waveforms with a small and simple neural network incorporating dynamic synapses [6], [7]. The dynamic synapse neural network also exhibits a high degree of robustness against noise [8].

More recently, there has been increased interest in the functional role of synaptic dynamics [10], [12], [13], [14], [16]. For example, concepts similar to the first and second points of the aforementioned dynamic synapse have been proposed in [10], [14]. Maass and Zador (1998) incorporated the processes of facilitation and depression in their “dynamic stochastic synapse” as a basic computational unit in neural networks. Markram and colleagues (1998) also included facilitation and depression in their notion of “utilization of synaptic efficacy”, and the first two points of the dynamic synapse are clearly echoed in their proposed concept of “differential signaling” as a basis for information processing within networks of neurons.

Here we describe (1) how various synaptic mechanisms can be incorporated into networks of neurons via dynamic synapses as a way of understanding their functional role in neural information processing, and (2) how the synaptic dynamics can be modified as a basis of learning and memory.


Dynamic synapse formalism

A straightforward implementation of the dynamic synapse is to express the potential of neurotransmitter release as a weighted sum of the activations of the presynaptic mechanisms:

R_i(t) = \sum_m K_{i,m}(t)\, F_{i,m}(A_p(t)),

where R_i(t) is the potential for release at presynaptic terminal i at time t, F_{i,m} is the mth mechanism in terminal i, K_{i,m} is the coefficient (weight) for mechanism F_{i,m}, and A_p indicates the occurrence (A_p = 1) or non-occurrence (A_p = 0) of an action potential. Any number of synaptic mechanisms can be incorporated into this formalism.
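To make the formalism concrete, the following is a minimal Python sketch, assuming (purely for illustration) two presynaptic mechanisms modeled as leaky integrators: a fast facilitating trace with a positive coefficient and a slower depressing trace with a negative one. The time constants, coefficients, and release threshold are hypothetical choices, not parameters from the paper.

```python
import numpy as np

def simulate_release_potential(ap, dt=1.0,
                               taus=(5.0, 50.0),  # mechanism time constants (ms); assumed
                               K=(1.0, -0.4),     # coefficients K_{i,m}; assumed
                               theta=0.5):        # release threshold; assumed
    """Release potential R_i(t) and binary release events for one terminal i."""
    taus = np.asarray(taus)
    F = np.zeros(len(taus))                # mechanism activations F_{i,m}
    R = np.zeros(len(ap))                  # release potential R_i(t)
    events = np.zeros(len(ap), dtype=int)  # discrete release events
    for t, spike in enumerate(ap):
        # Each mechanism decays exponentially and is driven by the spike train.
        F += dt * (-F / taus) + spike
        # R_i(t) = sum_m K_{i,m} * F_{i,m}(A_p(t))
        R[t] = np.dot(K, F)
        events[t] = int(R[t] > theta)      # a release "spike" when R exceeds theta
    return R, events

# Example: a 100 ms spike train with one action potential every 10 ms.
ap = np.zeros(100)
ap[::10] = 1
R, events = simulate_release_potential(ap)
```

Because the slow trace carries a negative weight, sustained high-frequency input gradually suppresses release, so different input rhythms yield different release-event sequences; varying the time constants and coefficients across terminals would give each terminal its own transformation of the same spike train.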

Dynamic learning algorithm

The process of neurotransmitter release is dynamically mediated by complex molecular and cellular mechanisms. Furthermore, these mechanisms are themselves dynamic, under constant regulation and modulation [9], [18]. How can such second-order regulation be expressed mathematically? An even more intriguing question is how such regulation can be utilized for learning and memory. One plausible answer is to modify the presynaptic mechanisms according to the correlation of presynaptic and postsynaptic activity.
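Read as an algorithm, one plausible realization is a Hebbian-style update of the mechanism coefficients K_{i,m}. The sketch below assumes the correlation is taken as the time-averaged product of each mechanism's activation and a postsynaptic activity trace; the learning rate and the form of the traces are illustrative assumptions, not the paper's exact rule.

```python
import numpy as np

def update_coefficients(K, F_trace, post_trace, lr=0.01):
    """One learning step for a single presynaptic terminal.

    K          : (n_mechanisms,) current coefficients K_{i,m}
    F_trace    : (T, n_mechanisms) mechanism activations F_{i,m}(t) over a trial
    post_trace : (T,) postsynaptic activity, e.g. low-pass filtered spikes
    """
    # delta K_{i,m} proportional to the time-averaged correlation
    # between F_{i,m}(t) and the postsynaptic activity y(t).
    delta_K = lr * (F_trace.T @ post_trace) / len(post_trace)
    return np.asarray(K) + delta_K
```

Under such a rule, a mechanism whose activation reliably coincides with postsynaptic firing has its coefficient strengthened, so learning reshapes the terminal's transformation function rather than a single scalar weight.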

Speech recognition case study

We have demonstrated the computational capability of dynamic synapses by performing speech recognition on unprocessed, noisy raw waveforms of words spoken by multiple speakers, using a simple neural network consisting of a small number of neurons connected by dynamic synapses [6], [7]. The neural network is organized into two layers, an input layer and an output layer, with five neurons in each layer, plus one inhibitory interneuron, each modeled as an “integrate-and-fire” neuron. The input neurons convert the raw speech waveform into trains of action potentials.
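As a rough illustration of the input stage, here is a minimal “integrate-and-fire” neuron driven directly by a waveform; the membrane time constant, threshold, and reset value are illustrative choices, not the parameters used in the study.

```python
import numpy as np

def integrate_and_fire(waveform, dt=0.1, tau_m=10.0, v_th=1.0, v_reset=0.0):
    """Convert an input waveform into a train of action potentials."""
    v = 0.0
    spikes = np.zeros(len(waveform), dtype=int)
    for t, x in enumerate(waveform):
        v += dt * (-v + x) / tau_m   # leaky integration of the input
        if v >= v_th:                # threshold crossing produces a spike
            spikes[t] = 1
            v = v_reset              # membrane potential resets after firing
    return spikes

# Example: spike train from a noisy 440 Hz tone sampled at 10 kHz.
t = np.arange(0, 0.1, 1e-4)
wave = np.abs(np.sin(2 * np.pi * 440 * t)) + 0.1 * np.random.randn(len(t))
spike_train = integrate_and_fire(wave)
```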

Discussion

The brain embodies an enormous coding capacity, representing almost unlimited variation in signals from the external world, and tremendous computational power, robustly processing such noisy signals to identify invariants therein and generate timely reactions. A major unresolved issue in neuroscience is how such functional capacities of the brain depend on (emerge from) underlying cellular and molecular mechanisms. This issue remains unresolved because of the number and complexity of the mechanisms involved and their interactions.

Acknowledgments

Supported by ONR, NIMH, DARPA, and NCRR.

Jim-Shih Liaw is a research assistant professor in the Biomedical Engineering Department at the University of Southern California. He received his Ph.D. in computer science from USC in 1993. His research concerns how higher neural functions emerge from underlying molecular and cellular mechanisms and their complex interactions.



