
Knowledge-Based Systems

Volume 80, May 2015, Pages 24–33

Evolving connectionist systems for adaptive learning and knowledge discovery: Trends and directions

https://doi.org/10.1016/j.knosys.2014.12.032

Abstract

This paper traces 25 years of development of methods and systems for knowledge-based neural networks and, more specifically, the recent evolving connectionist systems (ECOS). ECOS combine the adaptive/evolving learning ability of neural networks with the approximate reasoning and linguistically meaningful explanation features of symbolic representations such as fuzzy rules. This review presents the now-classical hybrid expert systems and evolving neuro-fuzzy systems, along with new developments in spiking neural networks, neurogenetic systems and quantum-inspired systems, all discussed from the point of view of their adaptability, model interpretability and knowledge discovery. The paper also discusses new directions for the integration of principles from neural networks, fuzzy systems, bio- and neuroinformatics, and nature in general.

Section snippets

Hybrid connectionist systems

The human brain uniquely combines low-level neuronal learning, in the neurons and the connections between them, with higher-level rule abstraction, leading to adaptive learning and abstract concept formation. This is the ultimate inspiration for the development of hybrid connectionist systems, where specially constructed artificial neural networks (NN) are trained on data so that, after training, an abstract knowledge representation can be derived that explains the data and can be further interpreted as …

Fuzzy neurons and fuzzy neural networks. Evolving connectionist systems

A low-level integration of fuzzy rules into a single neuron model and larger neural network structures, tightly coupling learning and fuzzy reasoning rules into connectionist structures, was initiated by Professor Takeshi Yamakawa and other Japanese scientists and promoted at a series of IIZUKA conferences in Japan [39]. Since then, many models of fuzzy neural networks have been developed based on these principles [8], [19], [23].

The evolving neuro-fuzzy systems developed these ideas further, where …

Current trends in ECOS: Evolving spiking neural networks (eSNN)

A single biological neuron and its associated synapses form a complex information-processing machine that involves short-term information processing, long-term information storage, and evolutionary information stored as genes in the nucleus of the neuron. A spiking neuron model assumes input information represented as trains of spikes over time. When sufficient input information is accumulated in the membrane of the neuron and the neuron’s postsynaptic potential exceeds a threshold, the neuron emits an output spike …
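The snippet above outlines the general spiking-neuron principle. A minimal leaky integrate-and-fire sketch in Python is given below as an illustration only; it is a generic, assumed formulation rather than any specific neuron model reviewed in the paper. Weighted input spikes accumulate into a postsynaptic potential that decays over time, and an output spike is emitted when the potential crosses a firing threshold.

```python
import numpy as np

def lif_neuron(spike_trains, weights, tau=20.0, threshold=1.0, dt=1.0):
    """Leaky integrate-and-fire sketch: weighted input spikes accumulate into
    a postsynaptic potential (PSP) that leaks over time; when the PSP exceeds
    the firing threshold the neuron emits a spike and the PSP is reset."""
    n_steps = spike_trains.shape[1]
    psp = 0.0
    output_spikes = np.zeros(n_steps, dtype=int)
    for t in range(n_steps):
        psp += weights @ spike_trains[:, t]   # integrate incoming spikes
        psp *= np.exp(-dt / tau)              # leak (decay) of the potential
        if psp > threshold:                   # threshold crossing -> fire
            output_spikes[t] = 1
            psp = 0.0                         # reset after the output spike
    return output_spikes

# Toy example: 3 input spike trains over 100 time steps, ~10% spike probability
rng = np.random.default_rng(0)
inputs = (rng.random((3, 100)) < 0.1).astype(int)
print(lif_neuron(inputs, weights=np.array([0.4, 0.5, 0.6])).sum(), "output spikes")
```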

A current trend and a future direction: Evolving Computational Neuro-Genetic Models (eCNGM)

A neurogenetic model of a neuron is proposed and studied in [4] (Kasabov, 2010). It utilises information about how some proteins and genes affect the spiking activity of a neuron, such as fast excitation, fast inhibition, slow excitation and slow inhibition. An important part of the model is a dynamic gene/protein regulatory network (GRN) model of the dynamic interactions between genes/proteins over time that affect the spiking activity of the neuron (Fig. 9).
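To make the GRN idea more concrete, here is a rough Python sketch, assuming that gene/protein expression levels evolve through an interaction matrix and are then mapped onto neuron parameters such as excitation and inhibition scales. Both the dynamics and the gene-to-parameter mapping below are hypothetical placeholders; the actual eCNGM formulation is given in [4].

```python
import numpy as np

def grn_step(g, W, decay=0.1):
    """One GRN update: expression levels g interact through matrix W and
    decay over time (a deliberately simplified dynamic model)."""
    return np.clip(g + np.tanh(W @ g) - decay * g, 0.0, 1.0)

def genes_to_neuron_params(g):
    """Hypothetical mapping of expression levels onto spiking-neuron parameters."""
    return {
        "fast_excitation_scale": 0.5 + g[0],
        "fast_inhibition_scale": 0.5 + g[1],
        "firing_threshold":      1.0 + 0.5 * g[2],
    }

rng = np.random.default_rng(1)
genes = rng.random(4)                   # four interacting genes/proteins (illustrative)
W = rng.normal(scale=0.3, size=(4, 4))  # gene/protein interaction matrix
for _ in range(10):                     # let the GRN evolve over 10 time steps
    genes = grn_step(genes, W)
print(genes_to_neuron_params(genes))
```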

New types of neuro-genetic fuzzy …

A current trend and a future direction: Quantum inspired eSNN (QeSNN)

QeSNNs use the principle of superposition of states to represent and optimize features (input variables) and gene parameters of an eSNN model [22]. They are optimized through a quantum-inspired genetic algorithm [6] or QiPSO. Features or genes are represented as qubits in a superposition of 1 (selected), with a probability α, and 0 (not selected), with a probability β. When the model has to be calculated, the quantum bits ‘collapse’ into a state of either 1 or 0. Fuzzy rules in a QeSNN look like: IF < …
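A simplified Python sketch of this representation is shown below: each feature is held as a probability of being selected (standing in for the qubit amplitudes α and β) and is 'collapsed' to 1 or 0 whenever a concrete model has to be evaluated. The fitness function is a toy placeholder rather than an actual eSNN evaluation, and the probability update is a crude stand-in for the quantum-inspired rotation operators of [6].

```python
import numpy as np

rng = np.random.default_rng(42)
n_features = 8

# Each feature starts in an equal superposition: P(selected) = alpha**2 = 0.5
alpha = np.full(n_features, 1 / np.sqrt(2))

def collapse(alpha):
    """'Collapse' each qubit-like state to 1 (selected) or 0 (not selected)."""
    return (rng.random(alpha.size) < alpha**2).astype(int)

def fitness(mask):
    """Toy placeholder for evaluating an eSNN built on the selected features."""
    return mask.sum() - 2.0 * abs(mask.sum() - 4)   # prefers about 4 selected features

best_mask, best_fit = None, -np.inf
for generation in range(50):
    mask = collapse(alpha)            # observe a concrete feature subset
    f = fitness(mask)
    if f > best_fit:
        best_mask, best_fit = mask, f
    # Nudge selection probabilities toward the best subset found so far
    alpha = np.clip(alpha + 0.05 * (2 * best_mask - 1), 0.05, 0.995)

print("selected features:", np.flatnonzero(best_mask))
```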

A current trend and a future direction: The NeuCube eSNN spatio-temporal data machine

The latest development in the direction of eSNN and neurogenetic systems is a new architecture of a virtual spatio-temporal data machine called NeuCube [27]. It was initially proposed for spatio-temporal brain data modelling, but it has since been used for climate data modelling, stroke occurrence prediction and other applications.

The NeuCube framework is depicted in Fig. 11. It consists of the following modules (a minimal code sketch of the pipeline follows the list):

  • Input information encoding module.

  • 3D SNN module (the Cube).

  • Output module.
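A high-level Python sketch of how these three modules could be chained is shown below. The module bodies are placeholders only; NeuCube's actual input encoding, 3D recurrent SNN reservoir and output classification are described in [27].

```python
import numpy as np

def encode_input(signal, threshold=0.05):
    """Input encoding module (sketch): threshold-based encoding of a continuous
    signal into a spike train (1 = spike on a sufficiently large change)."""
    diffs = np.diff(signal, prepend=signal[0])
    return (np.abs(diffs) > threshold).astype(int)

def snn_cube(spike_trains):
    """3D SNN module (sketch): stands in for the recurrent 3D 'Cube'; here it
    simply returns the firing count per input channel as the Cube state."""
    return spike_trains.sum(axis=1)

def output_module(cube_state, weights):
    """Output module (sketch): a simple readout/classifier over the Cube state."""
    return int(cube_state @ weights > 0)

# Toy spatio-temporal sample: 3 channels x 100 time points (e.g. EEG-like signals)
rng = np.random.default_rng(3)
sample = np.cumsum(rng.normal(scale=0.1, size=(3, 100)), axis=1)
spikes = np.vstack([encode_input(channel) for channel in sample])
prediction = output_module(snn_cube(spikes), weights=np.array([0.2, -0.1, 0.3]))
print("predicted class:", prediction)
```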

Conclusion

This paper presents an overview of trends and directions of evolving connectionist systems (ECOS). The main goal of ECOS is to facilitate the creation of computational models and systems for adaptive learning and knowledge discovery from complex data. ECOS principles are derived from the integration of principles from neural networks, fuzzy systems, evolutionary computation, quantum computing and brain information processing. ECOS applications are manifold, but perhaps most welcome in the …

Acknowledgement

The work on this paper was supported by the Knowledge Engineering and Discovery Research Institute (KEDRI, http://www.kedri.aut.ac.nz). Joyce D’Mello helped with the organization of this paper. More papers, data and software systems can be found at http://www.kedri.aut.ac.nz and http://ncs.ethz/projects/evospike/.

References (41)

  • P. Angelov, Evolving Rule-based Models: A Tool for Design of Flexible Adaptive Systems, 2002.

  • L. Benuskova et al., Computational Neuro-genetic Modelling, 2007.

  • M. Defoin-Platel et al., Quantum-inspired evolutionary algorithm: a multi-model EDA, IEEE Trans. Evol. Comput., 2009.

  • Knowledge-Based Systems Journal, Expert Systems, vol. 1, no. 2, 1988, p....

  • T. Furuhashi, T. Hasegawa, S. Horikawa, Y. Uchikawa, An Adaptive Fuzzy Controller Using Fuzzy Neural Networks, in:...

  • M. Futschik, N. Kasabov, Fuzzy clustering in gene expression data analysis, in: Proc. of the World Congress of...

  • W. Gerstner, Time structure of the activity of neural network models, Phys. Rev., 1995.

  • D. Hebb, The Organization of Behavior, 1949.

  • A.L. Hodgkin et al., A quantitative description of membrane current and its application to conduction and excitation in nerve, J. Physiol., 1952.
Cited by (29)

    • Unsupervised Anomaly Detection in Stream Data with Online Evolving Spiking Neural Networks

      2021, Neural Networks
      Citation excerpt:

      Depending on the type of input data, the transformation can be carried out by means of the temporal encoding methods such as Step-Forward or Threshold-Based (Maciąg et al., 2019; Petro et al., 2019) or, alternatively, with the use of Gaussian Receptive Fields (Lobo et al., 2018). The distinctive feature of the eSNN is that its repository of output neurons evolves during the training phase based on candidate output neurons that are created for every new input data sample (Kasabov, 2015; Kasabov et al., 2013). More specifically, for each new input value, a new candidate output neuron is created and is either added to the output repository or, based on the provided similarity threshold, is merged with one of the output neurons contained in the repository.

    • IT2-GSETSK: An evolving interval Type-II TSK fuzzy neural system for online modeling of noisy data

      2020, Neurocomputing
      Citation excerpt:

      However as highlighted in [6], application of IT2FNN in online and evolving modeling is still in the nascent stage and needs more research. Evolving FNNs [7,8] have specific characteristics which are desirable for real-life applications. In non-stationary or evolving environment [9], the distribution of the data related to the behavior of a specific phenomenon changes gradually to abrupt and beyond its historical bounds.

    • An end-to-end functional spiking model for sequential feature learning

      2020, Knowledge-Based Systems
      Citation excerpt:

      Inspired by which, the spiking neural network (SNN) and neuromorphic low-power system are proposed and have gained a lot of attention in recent years [3–5]. These SNN based systems have been successfully applied to various fields, such as pattern recognition [6–10], aggregate label learning [11,12], neuromorphic chips [13,14], etc. [15]. One of the important bases for these achievements is to utilize an effective SNN model.

    • Long-term learning for type-2 neural-fuzzy systems

      2019, Fuzzy Sets and Systems
      Citation excerpt:

      In the literature, a number of adaptive neural fuzzy systems have been developed to deal with such problems, for example online learning [8], incremental learning [9,10], lifelong learning [11], and knowledge-based learning neural networks [12]. Recent research in online learning concentrates on adaptive learning rates to follow time-varying distributions [5,12,13]. Incremental learning is the process of repeatedly training a model with new data without destroying the old prototype patterns [14], while lifelong learning also termed long-term or “continuous learning” addresses learning through the entire lifespan of a system [15].

    • Clustering-based undersampling in class-imbalanced data

      2017, Information Sciences
      Citation excerpt:

      The threshold method involves setting different threshold values for different classes during the classifier learning stage [37], whereas the one-class learning method entails training the classifier from a training set that contains only one specific class [10,30]. Other types of algorithms, such as evolving clustering in neurofuzzy systems, evolving clustering of dynamic data in spiking neural networks, clustering personalized modeling, and clustering through quantum-inspired evolutionary algorithms, have also been developed to deal with imbalanced data [18–20,28]. The cost-sensitive solutions focus on defining different misclassification costs of classifiers for different classes.
