
Some new neural network architectures with improved learning schemes

  • Original paper
  • Published: 2000
  • Journal: Soft Computing

Abstract

 Here we present two new neuron-model architectures and one modified form of the existing standard feedforward architecture (MSTD). Both new models use the self-scaling scaled conjugate gradient algorithm (SSCGA) and the lambda–gamma (L–G) algorithm, and combine the properties of basic and higher-order neurons (i.e., the multiplication and aggregation functions). Of the two, the compensatory neural network architecture (CNNA) requires relatively few inter-neuronal connections, cuts the computational budget by almost 50%, and speeds up convergence, while also giving better training and prediction accuracy. The second model, sigma–pi–sigma (SPS), ensures faster convergence and better training and prediction accuracy. The third model (MSTD) performs much better than the standard feedforward architecture (STD). The effect of normalizing the outputs for training is also studied here: at a low iteration count (∼500), it shows virtually no improvement as the scaling range increases. Increasing the number of neurons beyond a point is likewise shown to have little effect in the case of higher-order neurons. Numerous simulation runs on the satellite orbit-determination problem and the complex XOR problem establish the robustness of the proposed neuron-model architectures.
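The abstract does not reproduce the forward pass of these models, but the sigma–pi–sigma idea (weighted sums feeding product units feeding a final weighted sum) can be illustrated with a short sketch. The code below is a generic higher-order unit of that shape, not the paper's exact formulation: the layer sizes, the 0/1 exponent matrix used to form the product terms, and the tanh activations are all assumptions made for illustration.

```python
import numpy as np

def sigma_pi_sigma(x, W1, E, w_out):
    """Illustrative sigma-pi-sigma unit (NOT the paper's exact model).

    sigma: first layer forms weighted sums of the inputs.
    pi:    second layer multiplies selected first-layer outputs together,
           giving the higher-order (product) terms.
    sigma: output unit forms a weighted sum of the product terms.
    """
    s = np.tanh(W1 @ x)                      # first sigma layer (assumed tanh)
    # Each row of the 0/1 exponent matrix E picks which s-units to multiply.
    p = np.prod(np.power(s[None, :], E), axis=1)
    return np.tanh(w_out @ p)                # output sigma unit (assumed tanh)

# Tiny usage example on an XOR-style input, as mentioned in the abstract.
rng = np.random.default_rng(0)
x = np.array([1.0, 0.0])
W1 = rng.normal(size=(3, 2))                 # 3 hidden sums of 2 inputs (assumed sizes)
E = np.array([[1, 1, 0],                     # product of units 0 and 1
              [0, 1, 1],                     # product of units 1 and 2
              [1, 0, 1]])                    # product of units 0 and 2
w_out = rng.normal(size=3)
print(sigma_pi_sigma(x, W1, E, w_out))
```

The product (pi) terms are what give higher-order neurons their extra representational power over purely additive units; the paper's SSCGA and L–G schemes then train the weights of such structures.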



Cite this article

Sinha, M., Kumar, K. & Kalra, P. Some new neural network architectures with improved learning schemes. Soft Computing 4, 214–223 (2000). https://doi.org/10.1007/s005000000057

