Abstract
Learning is the essence of an artificial neural network (ANN), yet the error surface of a network typically contains multiple local minima, and local training methods can become trapped in them. Global optimization methods are capable of finding a globally optimal solution. In this paper we present a comparative study of the effects of probabilistic and deterministic global search methods for artificial neural networks, using a fully connected feedforward multi-layer perceptron architecture. We investigate two probabilistic global search methods, the genetic algorithm and simulated annealing, and a deterministic method, the cutting angle method, for finding the weights of a neural network. Experiments were carried out on UCI benchmark datasets.
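The full text is not available in this preview, but as a rough illustration of the probabilistic side of the comparison, the following is a minimal sketch of simulated annealing applied to the flattened weight vector of a small feedforward MLP. The network size, cooling schedule, perturbation scale, and toy data are illustrative assumptions, not the authors' experimental setup.

```python
import numpy as np

# Minimal sketch: simulated annealing over the weights of a one-hidden-layer MLP.
# All hyperparameters (layer sizes, schedule, step scale) are illustrative assumptions.

rng = np.random.default_rng(0)

def mlp_loss(w, X, y, n_in, n_hid):
    """Mean squared error of a one-hidden-layer tanh MLP with weights unpacked from w."""
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid]; i += n_hid
    b2 = w[i]
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    return np.mean((pred - y) ** 2)

def anneal(X, y, n_in, n_hid, T0=1.0, cooling=0.995, steps=5000, scale=0.1):
    n_w = n_in * n_hid + n_hid + n_hid + 1   # total weight count
    w = rng.normal(0, 0.5, n_w)
    loss = mlp_loss(w, X, y, n_in, n_hid)
    best_w, best_loss = w.copy(), loss
    T = T0
    for _ in range(steps):
        cand = w + rng.normal(0, scale, n_w)  # random perturbation of all weights
        cand_loss = mlp_loss(cand, X, y, n_in, n_hid)
        # Accept downhill moves always; uphill moves with Boltzmann probability.
        if cand_loss < loss or rng.random() < np.exp((loss - cand_loss) / T):
            w, loss = cand, cand_loss
            if loss < best_loss:
                best_w, best_loss = w.copy(), loss
        T *= cooling                          # geometric cooling schedule
    return best_w, best_loss

# Toy usage: fit y = sin(x) on a small random sample.
X = rng.uniform(-3, 3, (64, 1))
y = np.sin(X).ravel()
w, final_loss = anneal(X, y, n_in=1, n_hid=8)
print(f"final MSE: {final_loss:.4f}")
```

A genetic algorithm would operate on a population of such weight vectors with selection, crossover, and mutation over the same encoding, while the cutting angle method is deterministic, minimizing a sequence of piecewise-linear underestimates of the training error.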
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Ghosh, R., Ghosh, M., Yearwood, J., Bagirov, A. (2005). Comparative Analysis of Genetic Algorithm, Simulated Annealing and Cutting Angle Method for Artificial Neural Networks. In: Perner, P., Imiya, A. (eds) Machine Learning and Data Mining in Pattern Recognition. MLDM 2005. Lecture Notes in Computer Science, vol 3587. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11510888_7
DOI: https://doi.org/10.1007/11510888_7
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-26923-6
Online ISBN: 978-3-540-31891-0
eBook Packages: Computer Science (R0)