
Second Order Back Propagation Neural Network (SOBPNN) Algorithm for Medical Data Classification

  • Conference paper
Computational Intelligence in Information Systems

Part of the book series: Advances in Intelligent Systems and Computing ((AISC,volume 331))

Abstract

Gradient-based methods are among the most widely used error-minimization techniques for training back propagation neural networks (BPNN). Some second order learning methods work with a quadratic approximation of the error function derived from the Hessian matrix, and achieve improved convergence rates in many cases. This paper introduces an improved second order back propagation algorithm that computes the Hessian matrix efficiently by adaptively modifying the search direction. It shows that a simple modification to the initial search direction, i.e. the gradient of the error with respect to the weights, can substantially improve training efficiency. The efficiency of the proposed SOBPNN is verified by means of simulations on five medical data classification problems. The results show that SOBPNN significantly improves the learning performance of BPNN.
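The abstract's core idea is a Newton-style weight update, w ← w − H⁻¹g, where g is the error gradient and H the Hessian, with the initial search direction (the gradient) modified before the step. The sketch below illustrates that idea on a deliberately tiny 1-D problem: one sigmoid neuron fit to a single training pair, with finite-difference first and second derivatives. The `gain` factor scaling the search direction is hypothetical, standing in for the paper's (unspecified here) modification; this is an illustrative sketch, not the authors' SOBPNN algorithm.

```python
import math

# Toy 1-D example: fit y = sigmoid(w * x) to one training pair (x, t)
# by minimizing E(w) = 0.5 * (sigmoid(w*x) - t)^2.
# A second-order (Newton-style) step is w_new = w - g / h, where
# g = dE/dw and h = d2E/dw2 (the 1-D analogue of the Hessian).

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def error(w, x, t):
    return 0.5 * (sigmoid(w * x) - t) ** 2

def grad(w, x, t, eps=1e-5):
    # Central-difference approximation of dE/dw.
    return (error(w + eps, x, t) - error(w - eps, x, t)) / (2 * eps)

def hess(w, x, t, eps=1e-4):
    # Central-difference approximation of d2E/dw2.
    return (error(w + eps, x, t) - 2 * error(w, x, t)
            + error(w - eps, x, t)) / eps ** 2

def newton_train(w, x, t, gain=1.0, steps=20):
    for _ in range(steps):
        g = gain * grad(w, x, t)   # gain-scaled search direction (illustrative)
        h = hess(w, x, t)
        if h > 1e-8:
            w -= g / h             # second-order (Newton) step
        else:
            w -= 0.5 * g           # fall back to a plain gradient step
    return w

w = newton_train(w=0.1, x=2.0, t=0.9)
print(error(w, 2.0, 0.9))  # error after training (near zero)
```

In a real multi-layer network g is a vector and H a matrix, so the division becomes a linear solve (or a conjugate-gradient approximation of it); the fallback branch mirrors the standard safeguard of reverting to gradient descent when the Hessian is not positive definite.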





Author information


Corresponding author

Correspondence to Nazri Mohd Nawi.


Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Nawi, N.M., Hamid, N.A., Harsad, N., Ramli, A.A. (2015). Second Order Back Propagation Neural Network (SOBPNN) Algorithm for Medical Data Classification. In: Phon-Amnuaisuk, S., Au, T. (eds) Computational Intelligence in Information Systems. Advances in Intelligent Systems and Computing, vol 331. Springer, Cham. https://doi.org/10.1007/978-3-319-13153-5_8

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-13153-5_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-13152-8

  • Online ISBN: 978-3-319-13153-5

  • eBook Packages: Engineering (R0)
