Evolutionary Radial Basis Functions for Credit Assessment

  • Published in: Applied Intelligence

Abstract

Credit analysts generally assess the risk of credit applications based on their previous experience, and they frequently employ quantitative methods to this end. Among these methods, Artificial Neural Networks have been particularly successful and have been incorporated into several computational tools. However, the performance of an Artificial Neural Network depends strongly on the choice of adequate values for its free parameters. This article discusses a new approach to the design of a particular Artificial Neural Network model, the Radial Basis Function (RBF) network, using Genetic Algorithms. It presents an overview of the problems involved and of the different approaches employed to optimize Artificial Neural Networks genetically, and discusses several methods proposed in the literature for optimizing RBF networks with Genetic Algorithms. Finally, the model proposed by the authors is described, and experimental results obtained with this model on a credit risk assessment problem are presented.
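The combination the abstract describes can be illustrated with a minimal sketch: an RBF network whose hidden layer uses Gaussian basis functions, with the output weights fitted by linear least squares and the centers and a shared width evolved by a small Genetic Algorithm. This is a generic, hedged illustration of the technique, not the authors' actual model; all function names, the genome encoding (centers plus one shared width), and the mutation scales are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_design_matrix(X, centers, width):
    """Gaussian basis activations: phi_j(x) = exp(-||x - c_j||^2 / (2 w^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_output_weights(X, y, centers, width):
    """Hidden-to-output weights by linear least squares on the activations."""
    Phi = rbf_design_matrix(X, centers, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def mse(X, y, centers, width, w):
    pred = rbf_design_matrix(X, centers, width) @ w
    return float(((pred - y) ** 2).mean())

def evolve_rbf(X, y, n_centers=5, pop_size=20, generations=30):
    """Tiny GA: each genome encodes the RBF centers and one shared width.

    Fitness is the training error of the least-squares-fitted network;
    selection keeps the better half, mutation perturbs centers and width.
    """
    dim = X.shape[1]
    pop = [(rng.normal(size=(n_centers, dim)), abs(rng.normal()) + 0.5)
           for _ in range(pop_size)]

    def fitness(genome):
        centers, width = genome
        w = fit_output_weights(X, y, centers, width)
        return mse(X, y, centers, width, w)

    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]          # truncation selection
        children = []
        for centers, width in elite:
            # Gaussian mutation of centers and (positivity-clipped) width
            children.append((centers + rng.normal(scale=0.1, size=centers.shape),
                             max(0.1, width + rng.normal(scale=0.05))))
        pop = elite + children

    best = min(pop, key=fitness)
    return best, fit_output_weights(X, y, *best)
```

For instance, evolving five centers to fit a sine curve (`X` of shape `(n, 1)`, targets `np.sin(X).ravel()`) drives the training error well below that of a random initial genome. In a credit-scoring setting, `y` would instead be a good/bad-payer label and fitness would be measured on held-out applications rather than on the training set.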



Author information


Correspondence to Estefane Lacerda.


About this article

Cite this article

Lacerda, E., Carvalho, A.C.P.L.F., Braga, A.P. et al. Evolutionary Radial Basis Functions for Credit Assessment. Appl Intell 22, 167–181 (2005). https://doi.org/10.1007/s10791-005-6617-0
