Abstract
Artificial neural networks (ANNs) constitute a class of flexible nonlinear models designed to mimic biological neural systems. ANNs are one of the three main components of computational intelligence and, as such, they have often been hybridized from different perspectives. In this paper, a review of some of the main contributions on hybrid ANNs is given, considering three points of view: models, algorithms and data.
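As a minimal illustration of the "models" perspective on hybridization, the sketch below combines two kinds of hidden units, sigmoidal projection units and Gaussian kernel (RBF) units, in a single hidden layer whose outputs are mixed linearly. This is only an assumed, illustrative toy in Python; the function name, dimensions and random parameters are not taken from the paper.

# A minimal sketch of a hybrid single-hidden-layer network mixing
# projection (sigmoidal) and kernel (Gaussian RBF) hidden units.
# All names and dimensions are illustrative assumptions.
import numpy as np

def hybrid_forward(x, W_sig, b_sig, centers, widths, beta):
    """Forward pass of a hybrid ANN on a single input vector x.

    W_sig, b_sig    -- weights/biases of the sigmoidal (projection) units
    centers, widths -- centres/widths of the Gaussian RBF (kernel) units
    beta            -- output-layer weights (bias term first)
    """
    # Projection units: sigmoid of a linear projection of the input.
    sig_units = 1.0 / (1.0 + np.exp(-(W_sig @ x + b_sig)))
    # Kernel units: Gaussian response to the distance from each centre.
    rbf_units = np.exp(-np.sum((centers - x) ** 2, axis=1) / (2.0 * widths ** 2))
    # The output layer combines both families of basis functions linearly.
    hidden = np.concatenate(([1.0], sig_units, rbf_units))  # prepend bias
    return beta @ hidden

# Toy usage: 2 inputs, 3 sigmoidal units, 2 RBF units, random parameters.
rng = np.random.default_rng(0)
x = rng.normal(size=2)
y = hybrid_forward(x,
                   W_sig=rng.normal(size=(3, 2)), b_sig=rng.normal(size=3),
                   centers=rng.normal(size=(2, 2)), widths=np.ones(2),
                   beta=rng.normal(size=1 + 3 + 2))
print(y)

Fitting such a model would then fall under the "algorithms" perspective, for instance by optimizing the projection weights, centres and widths with an evolutionary algorithm.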
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Gutiérrez, P.A., Hervás-Martínez, C. (2011). Hybrid Artificial Neural Networks: Models, Algorithms and Data. In: Cabestany, J., Rojas, I., Joya, G. (eds) Advances in Computational Intelligence. IWANN 2011. Lecture Notes in Computer Science, vol 6692. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21498-1_23
DOI: https://doi.org/10.1007/978-3-642-21498-1_23
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-21497-4
Online ISBN: 978-3-642-21498-1