Comparing Stochastic Gradient Descent and Mini-batch Gradient Descent Algorithms in Loan Risk Assessment

  • Conference paper
Informatics and Intelligent Applications (ICIIA 2021)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1547)


Abstract

Despite the growing use of artificial neural networks (ANNs) and deep learning (DL) to reduce the number of slippage loans in the banking and financial industries, some Nigerian banks and financial institutions remain unaware of their efficacy. Automated loan request assessment would improve the efficiency of credit decisions and, in turn, save the time and cost associated with loan analysis. This research compared and contrasted two artificial neural network training algorithms: mini-batch gradient descent and stochastic gradient descent. Each algorithm was used separately with back-propagation neural networks to develop loan evaluation models. Samples collected from a Nigerian bank, containing records of default and non-default customers, were used to train the models. The results show that stochastic gradient descent outperforms mini-batch gradient descent in accuracy (0.863 versus 0.835) and in space complexity, while mini-batch gradient descent performs better in time complexity (107 µs versus 1 ms). Stochastic gradient descent was also superior at identifying defaulting borrowers, with 87% accuracy.
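The comparison described above hinges on how many training samples contribute to each weight update: stochastic gradient descent updates after every sample, while mini-batch gradient descent averages the gradient over a small batch. The sketch below is not the authors' implementation; it uses synthetic data, a single-layer logistic model (rather than the paper's back-propagation network), and illustrative hyper-parameters (learning rate 0.1, batch size 32) purely to show the structural difference between the two update schemes.

```python
# Minimal sketch, assuming synthetic borrower data and a logistic model.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for borrower features and default/non-default labels.
X = rng.normal(size=(1000, 8))
true_w = rng.normal(size=8)
y = (X @ true_w + rng.normal(scale=0.5, size=1000) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, batch_size, lr=0.1, epochs=20):
    """Gradient descent on the log-loss of a logistic model.
    batch_size=1 -> stochastic gradient descent;
    batch_size>1 -> mini-batch gradient descent."""
    w = np.zeros(X.shape[1])
    b = 0.0
    n = len(y)
    for _ in range(epochs):
        order = rng.permutation(n)
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            p = sigmoid(X[idx] @ w + b)
            err = p - y[idx]                      # gradient of log-loss w.r.t. the logit
            w -= lr * X[idx].T @ err / len(idx)   # average gradient over the batch
            b -= lr * err.mean()
    return w, b

for name, bs in [("stochastic (batch size 1)", 1), ("mini-batch (batch size 32)", 32)]:
    w, b = train(X, y, bs)
    acc = ((sigmoid(X @ w + b) > 0.5) == y).mean()
    print(f"{name}: training accuracy {acc:.3f}")
```

With batch size 1 every update follows the gradient of a single sample (noisier but cheaper per step and requiring no batch buffer), while a batch of 32 averages the noise at the cost of more work per update; this is the trade-off behind the accuracy, space, and time figures reported in the abstract.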



Author information

Corresponding author

Correspondence to Chika Yinka-Banjo.


Copyright information

© 2022 Springer Nature Switzerland AG

About this paper


Cite this paper

Adigun, A.A., Yinka-Banjo, C. (2022). Comparing Stochastic Gradient Descent and Mini-batch Gradient Descent Algorithms in Loan Risk Assessment. In: Misra, S., Oluranti, J., Damaševičius, R., Maskeliunas, R. (eds) Informatics and Intelligent Applications. ICIIA 2021. Communications in Computer and Information Science, vol 1547. Springer, Cham. https://doi.org/10.1007/978-3-030-95630-1_20

  • DOI: https://doi.org/10.1007/978-3-030-95630-1_20

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-95629-5

  • Online ISBN: 978-3-030-95630-1

  • eBook Packages: Computer Science, Computer Science (R0)
