Abstract
Artificial neural networks are used to solve problems that are difficult for both humans and conventional computers. Unfortunately, training an artificial neural network is time consuming and, because training is a stochastic process, several cold starts are recommended. Neural network training is typically a two-step process. First, the network's weights are initialized using a non-greedy method so as to escape local minima. Second, an optimization method (e.g., conjugate gradient learning) is used to quickly find the nearest local minimum. In general, training seeks to reduce the mean square error between the desired output and the actual network output. One common method for network initialization is simulated annealing, which assigns good starting values to the network's weights before the optimization is performed. The performance of simulated annealing depends strongly on the cooling process. A cooling schedule based on temperature cycling is proposed to improve artificial neural network training. It is shown that temperature cycling reduces training time while decreasing the mean square error on auto-associative neural networks. Three auto-associative problems, the Trifolium, the Cardioid, and the Lemniscate of Bernoulli, are solved using exponential cooling, linear cooling, and temperature cycling to verify the results.
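To make the idea concrete: a cooling schedule is the sequence of temperatures used during annealing, and temperature cycling periodically reheats the system instead of cooling monotonically, letting the search escape poor local minima before cooling again. Below is a minimal sketch of Metropolis-style simulated annealing driven by a cycling schedule, applied to a toy one-dimensional cost function. All function names, parameter values, and the toy cost are illustrative assumptions, not the paper's implementation.

```python
import math
import random

def temperature_cycling_schedule(t0, t_min, n_cycles, steps_per_cycle):
    """Yield temperatures that decay exponentially within each cycle,
    then jump back up (reheat) at the start of the next cycle."""
    alpha = (t_min / t0) ** (1.0 / max(steps_per_cycle - 1, 1))
    for cycle in range(n_cycles):
        t = t0 * (0.5 ** cycle)  # each cycle restarts from a lower peak
        for _ in range(steps_per_cycle):
            yield t
            t *= alpha

def anneal(cost, initial, neighbor, schedule, rng):
    """Plain Metropolis acceptance over the given temperature schedule;
    tracks and returns the best state seen."""
    x, best = initial, initial
    for t in schedule:
        cand = neighbor(x, rng)
        delta = cost(cand) - cost(x)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = cand
        if cost(x) < cost(best):
            best = x
    return best

# Toy demo: a single "weight" with a quadratic cost plus a ripple
# that creates local minima (stand-in for a network's error surface).
rng = random.Random(0)
cost = lambda w: (w - 2.0) ** 2 + 0.5 * math.sin(5.0 * w)
neighbor = lambda w, r: w + r.uniform(-0.5, 0.5)
sched = temperature_cycling_schedule(t0=2.0, t_min=0.01,
                                     n_cycles=4, steps_per_cycle=200)
w0 = anneal(cost, initial=-4.0, neighbor=neighbor, schedule=sched, rng=rng)
```

In the paper's setting the state would be the full weight vector of the network and the cost would be the mean square error over the training set; `w0` would then serve as the starting point handed to the deterministic optimizer (e.g., conjugate gradients).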
Copyright information
© 2007 Springer-Verlag Berlin Heidelberg
Cite this paper
Ledesma, S., Torres, M., Hernández, D., Aviña, G., García, G. (2007). Temperature Cycling on Simulated Annealing for Neural Network Learning. In: Gelbukh, A., Kuri Morales, Á.F. (eds) MICAI 2007: Advances in Artificial Intelligence. MICAI 2007. Lecture Notes in Computer Science(), vol 4827. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-76631-5_16
Print ISBN: 978-3-540-76630-8
Online ISBN: 978-3-540-76631-5