The classification capability of a dynamic threshold neural network

https://doi.org/10.1016/0167-8655(94)90090-6

Abstract

This paper proposes a new type of neural network called the Dynamic Threshold Neural Network (DTNN). Through theoretical analysis, we prove that the classification capability of a DTNN can be twice that of a conventional sigmoidal multilayer neural network. In other words, to successfully learn an arbitrarily given training set, a DTNN may need as few as half the free parameters required by a sigmoidal multilayer neural network.
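To illustrate the idea behind the DTNN, the following is a minimal sketch, assuming (this architecture is an assumption for illustration, not taken from the paper) that a dynamic-threshold unit replaces the fixed bias of a sigmoidal unit with an input-dependent threshold t(x) = v·x + b, so the unit computes σ(w·x − t(x)):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def static_unit(x, w, b):
    # Conventional sigmoidal unit: a fixed scalar threshold (bias) b.
    return sigmoid(np.dot(w, x) + b)

def dynamic_threshold_unit(x, w, v, b):
    # Hypothetical dynamic-threshold unit: the threshold is itself a
    # linear function of the input, t(x) = v.x + b, rather than a constant.
    # (Assumed form for illustration only.)
    t = np.dot(v, x) + b
    return sigmoid(np.dot(w, x) - t)

if __name__ == "__main__":
    x = np.array([0.5, -1.0])
    w = np.array([1.0, 2.0])
    v = np.array([0.5, 0.5])
    print(static_unit(x, w, 0.1))
    print(dynamic_threshold_unit(x, w, v, 0.1))
```

Because the threshold moves with the input, a single such unit can carve out decision regions (e.g. band- or slab-shaped regions bounded by two parallel hyperplanes) that a fixed-threshold sigmoidal unit would need two units to realize, which is the intuition behind the paper's factor-of-two claim.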
