A New Supervised Classification of Credit Approval Data via the Hybridized RBF Neural Network Model Using Information Complexity

  • Conference paper
Data Science, Learning by Latent Structures, and Knowledge Discovery

Abstract

In this paper, we introduce a new approach for supervised classification that handles mixed (i.e., categorical, binary, and continuous) data structures using a hybrid radial basis function neural network (HRBF-NN). HRBF-NN supervised classification combines regression trees, ridge regression, and the genetic algorithm (GA) with radial basis function (RBF) neural networks (NN), using the information complexity (ICOMP) criterion as the fitness function, to carry out both classification and selection of the subset of best predictors that discriminate between the classes. In this manner, we reduce the dimensionality of the data and at the same time improve the classification accuracy of the fitted predictive model. We apply HRBF-NN supervised classification to a real benchmark credit approval mixed data set to classify customers into good/bad classes for credit approval. Our results show the excellent performance of the HRBF-NN method in supervised classification tasks.
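To make the core of the pipeline concrete, the following is a minimal NumPy sketch of an RBF network whose output weights are fitted by ridge regression, the regression component at the heart of the HRBF-NN. It is a simplification, not the authors' implementation: the regression-tree center selection, the GA subset search, and the ICOMP fitness criterion are all omitted, and a few points sampled from each class stand in for tree-derived RBF centers.

```python
import numpy as np

def rbf_design(X, centers, width):
    # Gaussian RBF activations: phi[i, j] = exp(-||x_i - c_j||^2 / (2 * width^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbf_ridge(X, y, centers, width, lam=1e-2):
    # Ridge-regularized least squares for the output-layer weights
    Phi = np.hstack([np.ones((len(X), 1)), rbf_design(X, centers, width)])
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)

def predict(X, centers, width, w):
    # Threshold the real-valued network output at 0.5 for a two-class label
    Phi = np.hstack([np.ones((len(X), 1)), rbf_design(X, centers, width)])
    return (Phi @ w > 0.5).astype(int)

# Toy two-class data standing in for good/bad credit applicants
rng = np.random.default_rng(0)
X0 = rng.normal(-1.0, 0.5, size=(50, 2))
X1 = rng.normal(+1.0, 0.5, size=(50, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

# Simplification: a few points per class stand in for tree-derived centers
centers = np.vstack([X0[:3], X1[:3]])
w = fit_rbf_ridge(X, y, centers, width=1.0)
acc = (predict(X, centers, width=1.0, w=w) == y).mean()
```

In the full HRBF-NN method, the centers and widths would come from regression-tree partitions of the input space, and the GA, scored by ICOMP, would search over predictor subsets; the ridge solve shown here is what each candidate model fit would look like.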



Acknowledgements

This paper was presented as an invited keynote by Prof. Bozdogan at the European Conference on Data Analysis (ECDA-2013) at the University of Luxembourg, Luxembourg, July 10–12, 2013. Prof. Bozdogan extends his gratitude to the conference organizers: Professors Sabine Krolak-Schwerdt, Matthias Böhmer, and Berthold Lausen.

Author information

Correspondence to Hamparsum Bozdogan.

Copyright information

© 2015 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Akbilgic, O., Bozdogan, H. (2015). A New Supervised Classification of Credit Approval Data via the Hybridized RBF Neural Network Model Using Information Complexity. In: Lausen, B., Krolak-Schwerdt, S., Böhmer, M. (eds) Data Science, Learning by Latent Structures, and Knowledge Discovery. Studies in Classification, Data Analysis, and Knowledge Organization. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-44983-7_2
