Neural networks, linear functions and neglected non-linearity


Abstract

The multiplicity of approximation theorems for neural networks does not relate to the approximation of linear functions per se. The problem for the network is to construct a linear function by superpositions of non-linear activation functions such as the sigmoid. This issue is important for applications of NNs in statistical tests for neglected non-linearity, where it is common practice to include a linear function through skip-layer connections. Our theoretical analysis and our evidence point in the same direction: the network can in fact provide linear approximations without additional ‘assistance’. Our paper suggests that skip-layer connections are unnecessary and, if employed, could lead to misleading results.
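To illustrate the underlying point, the following is a minimal sketch (the target function, the small weight eps and the interval are assumed values for illustration, not taken from the paper): near zero the sigmoid is almost linear, sigmoid(z) ≈ 1/2 + z/4, so a single sigmoid hidden unit driven by a small input weight can reproduce an affine target on a bounded interval without any skip-layer connection.

    import numpy as np

    # Sketch of the near-linearity argument (assumed values, not from the paper):
    # for small z, sigmoid(z) ~= 1/2 + z/4, so one sigmoid hidden unit with a
    # small input weight eps can reproduce y = a + b*x on [-1, 1] with no
    # skip-layer (direct input-to-output) connection.

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    a, b = 1.0, 2.0          # target linear function y = a + b*x (illustrative)
    eps = 0.01               # small input weight keeps the sigmoid near-linear
    v = 4.0 * b / eps        # output weight chosen so v * eps / 4 = b (slope)
    c = a - v / 2.0          # output bias chosen so v * 1/2 + c = a (intercept)

    x = np.linspace(-1.0, 1.0, 201)
    net = v * sigmoid(eps * x) + c   # hidden unit -> linear output layer only
    target = a + b * x

    # The discrepancy is O(eps**2) on a fixed interval; with eps = 0.01 it is
    # on the order of 1e-5, i.e. the sigmoid unit matches the line almost exactly.
    print("max |network - target| on [-1, 1]:", np.max(np.abs(net - target)))

Shrinking eps drives the gap towards zero, which is one concrete sense in which a purely sigmoidal hidden layer can supply the linear component that skip-layer connections are normally added to provide.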


Author information

Corresponding author

Correspondence to B. Curry.

Additional information

Received: August 2002; Revised: March 2003

AMS Classification: 82C32

The authors are grateful to Prof. Mick Silver and to GFK Marketing for help with the provision of data.


About this article

Cite this article

Curry, B., Morgan, P.H. Neural networks, linear functions and neglected non-linearity. Computational Management Science 1, 15–29 (2003). https://doi.org/10.1007/s10287-003-0003-4
