Evaluation of neural network performance and generalisation using thresholding functions

  • Original Article
  • Published in Neural Computing and Applications

Abstract

The application of a simple thresholding technique to help assess the satisfactory performance of classification networks formed from Multi-Layer Perceptron (MLP) artificial neural networks (ANNs) is discussed. Both conventional Maximum Likelihood and Bayesian Evidence-based training paradigms were implemented. Firstly, a simulated data set drawn from a two-dimensional Gaussian distribution was investigated to illustrate the physical significance of the threshold plots compared to the classifier output probability contours. Secondly, a real-world application data set comprising low-frequency vibration measurements on an aircraft wing (a GNAT trainer) is considered. It is demonstrated that simple threshold-based plots applied to classifier network outputs may provide a simple yet powerful technique to aid in the rejection of poorly regularised network structures.
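
The paper's own implementation used the NETLAB toolbox in MATLAB with Maximum Likelihood and Bayesian Evidence training; the short Python sketch below is only an illustration of the thresholding idea, using a scikit-learn MLP as a stand-in classifier. The class means, network size, regularisation (alpha) values and the 0.9 threshold are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the authors' NETLAB/MATLAB code): train small MLP
# classifiers on a two-class 2-D Gaussian problem and apply a simple output
# threshold, so that weakly and strongly regularised networks can be compared
# by how decisively (and how correctly) they classify the data.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Two overlapping 2-D Gaussian classes (assumed means, unit covariance).
n = 200
x0 = rng.multivariate_normal([0.0, 0.0], np.eye(2), n)
x1 = rng.multivariate_normal([1.5, 1.5], np.eye(2), n)
X = np.vstack([x0, x1])
y = np.hstack([np.zeros(n), np.ones(n)])

threshold = 0.9  # outputs below this are treated as "undecided"

for alpha in (1e-4, 1e-1):  # weak vs. strong weight-decay regularisation
    net = MLPClassifier(hidden_layer_sizes=(20,), alpha=alpha,
                        max_iter=3000, random_state=0).fit(X, y)

    proba = net.predict_proba(X)            # class-membership probabilities
    decided = proba.max(axis=1) >= threshold
    correct = net.predict(X) == y

    print(f"alpha={alpha:g}: "
          f"{decided.mean():.1%} of points classified above the threshold, "
          f"{correct[decided].mean():.1%} of those correct")
```

The sketch scores the training data purely for brevity; assessing generalisation as in the paper would use a held-out test set, with the threshold-based summaries compared across candidate network structures to flag poorly regularised ones.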


References

  1. Lippmann RP (1989) Pattern classification using neural networks. IEEE Commun Mag 27(11):47–64
  2. Zhang GP (2000) Neural networks for classification: a survey. IEEE Trans Syst Man Cybern C Appl Rev 30(4):451–462
  3. Lisboa PJG (2002) A review of evidence of health benefit from artificial neural networks in medical intervention. Neural Netw 15:11–39
  4. Berardi VL, Patuwo BE, Hu MY (2004) A principled approach for building and evaluating neural network classification models. Decis Support Syst 38:233–246
  5. Augusteijn MF, Folkert BA (2002) Neural network classification and novelty detection. Int J Remote Sens 23(14):2891–2902
  6. Masnata A, Sunseri M (1996) Neural network classification of flaws detected by ultrasonic means. NDT&E Int 29(2):87–93
  7. Manson G, Worden K, Allman D (2003a) Experimental validation of a structural health monitoring methodology. Part II. Novelty detection on a Gnat aircraft. J Sound Vib 259(2):345–363
  8. Manson G, Worden K, Allman D (2003b) Experimental validation of a structural health monitoring methodology. Part III. Damage location on an aircraft wing. J Sound Vib 259(2):365–385
  9. Dreiseitl S, Ohno-Machado L (2002) Logistic regression and artificial neural network classification models: a methodology review. J Biomed Inform 35:352–359
  10. Haykin S (1999) Neural networks, a comprehensive foundation, 2nd edn. Prentice Hall, Englewood Cliffs
  11. Schwarzer G, Vach W, Schumacher M (2000) On the misuses of artificial neural networks for prognostic and diagnostic classification in oncology. Stat Med 19:541–561
  12. Bishop CM (1995) Neural networks for pattern recognition. Oxford University Press, New York
  13. Nabney IT (2002) Netlab: algorithms for pattern recognition. Springer, Berlin Heidelberg New York
  14. MacKay DJC (2003) Information theory, inference, and learning algorithms. Cambridge University Press, Cambridge
  15. MacKay DJC (1992) A practical Bayesian framework for back-propagation networks. Neural Comput 4:448–472
  16. Worden K, Tomlinson GR (2001) Nonlinearity in structural dynamics. Institute of Physics Publishing

Acknowledgements

This work was supported by EPSRC grant number GR/R96415/01 in association with DSTL Farnborough, which is acknowledged for sample provision and assistance with data collection. The authors gratefully acknowledge the use of the software package NETLAB developed by Ian Nabney of Aston University (http://www.ncrg.aston.ac.uk/netlab/).

Author information

Corresponding author

Correspondence to S. G. Pierce.

About this article

Cite this article

Pierce, S.G., Worden, K. & Manson, G. Evaluation of neural network performance and generalisation using thresholding functions. Neural Comput & Applic 16, 109–124 (2007). https://doi.org/10.1007/s00521-006-0044-z
