
Hybrid Artificial Neural Networks: Models, Algorithms and Data

  • Conference paper
Advances in Computational Intelligence (IWANN 2011)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 6692)


Abstract

Artificial neural networks (ANNs) constitute a class of flexible nonlinear models designed to mimic biological neural systems. ANNs are one of the three main components of computational intelligence and, as such, have often been hybridized from different perspectives. This paper reviews some of the main contributions to hybrid ANNs from three points of view: models, algorithms and data.
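The "flexible nonlinear model" the abstract refers to can be made concrete with a minimal sketch (not taken from the paper): a single-hidden-layer feedforward network, where a nonlinear hidden layer feeds a linear output layer. The layer sizes, sigmoid activation, and random weights below are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    # Standard logistic activation; the source of the model's nonlinearity.
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, W1, b1, W2, b2):
    """Forward pass: input -> nonlinear hidden layer -> linear output."""
    h = sigmoid(W1 @ x + b1)   # hidden-unit activations (nonlinear basis functions)
    return W2 @ h + b2         # output as a linear combination of those bases

# Illustrative random parameters: 2 inputs -> 4 hidden units -> 1 output.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 2)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

y = forward(np.array([0.5, -1.0]), W1, b1, W2, b2)
print(y.shape)
```

Hybridization, in the paper's terms, then amounts to varying pieces of this picture: the basis functions in the hidden layer (models), the procedure that fits the weights (algorithms), or how the training set itself is used (data).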





Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Gutiérrez, P.A., Hervás-Martínez, C. (2011). Hybrid Artificial Neural Networks: Models, Algorithms and Data. In: Cabestany, J., Rojas, I., Joya, G. (eds) Advances in Computational Intelligence. IWANN 2011. Lecture Notes in Computer Science, vol 6692. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21498-1_23


  • DOI: https://doi.org/10.1007/978-3-642-21498-1_23

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-21497-4

  • Online ISBN: 978-3-642-21498-1

  • eBook Packages: Computer Science (R0)
