
Temperature Cycling on Simulated Annealing for Neural Network Learning

  • Conference paper
MICAI 2007: Advances in Artificial Intelligence (MICAI 2007)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 4827)


Abstract

Artificial neural networks are used to solve problems that are difficult for both humans and conventional computers. Unfortunately, artificial neural network training is time-consuming and, because it is a random process, several cold starts are recommended. Neural network training is typically a two-step process. First, the network's weights are initialized using a non-greedy method to avoid local minima. Second, an optimization method (e.g., conjugate gradient learning) is used to quickly find the nearest local minimum. In general, training seeks to reduce the mean square error computed between the desired output and the actual network output. One common method for network initialization is simulated annealing; it is used to assign good starting values to the network's weights before the optimization is performed. The performance of simulated annealing depends strongly on the cooling process. A cooling schedule based on temperature cycling is proposed to improve artificial neural network training. It is shown that temperature cycling reduces training time while decreasing the mean square error of auto-associative neural networks. Three auto-associative problems (the Trifolium, the Cardioid, and the Lemniscate of Bernoulli) are solved using exponential cooling, linear cooling, and temperature cycling to verify our results.
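
The abstract describes the schedules only in prose. As a rough illustration of the idea, the Python sketch below contrasts a conventional exponential cooling schedule with a temperature-cycling schedule when simulated annealing is used to choose starting weights for a small auto-associative network. The network size, cycle period, temperature bounds, perturbation scale, and the trifolium parameterization are hypothetical choices made for this example, not values taken from the paper.

```python
# Minimal sketch (not the authors' code) comparing exponential cooling with
# a temperature-cycling schedule for simulated-annealing weight initialization.
# All numeric settings below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def mse(weights, inputs, targets):
    """Mean square error of a one-hidden-layer auto-associative net (assumed layout: W1 then W2)."""
    n_in = inputs.shape[1]
    n_hid = len(weights) // (2 * n_in)
    w1 = weights[: n_in * n_hid].reshape(n_in, n_hid)
    w2 = weights[n_in * n_hid:].reshape(n_hid, n_in)
    hidden = np.tanh(inputs @ w1)
    outputs = hidden @ w2
    return np.mean((outputs - targets) ** 2)

def exponential_temperature(step, t0=1.0, alpha=0.95):
    """Classic exponential cooling: T_k = T_0 * alpha^k."""
    return t0 * alpha ** step

def cycling_temperature(step, t_low=0.05, t_high=1.0, period=50):
    """Temperature cycling: cool within each cycle, then reheat (illustrative form)."""
    phase = (step % period) / period              # runs from 0 to 1 within a cycle
    return t_high * (t_low / t_high) ** phase     # geometric decay, restarted each cycle

def anneal(schedule, inputs, targets, n_weights, steps=500, scale=0.5):
    """Metropolis-style acceptance of random weight perturbations."""
    w = rng.normal(0.0, 0.5, n_weights)
    best_w, best_e = w.copy(), mse(w, inputs, targets)
    e = best_e
    for k in range(steps):
        t = schedule(k)
        cand = w + rng.normal(0.0, scale * t, n_weights)   # smaller moves when cold
        e_cand = mse(cand, inputs, targets)
        if e_cand < e or rng.random() < np.exp((e - e_cand) / max(t, 1e-12)):
            w, e = cand, e_cand
            if e < best_e:
                best_w, best_e = w.copy(), e
    return best_w, best_e

# Auto-associative task: reproduce points sampled from a trifolium curve
# (assumed polar form r = cos(3*theta)).
theta = np.linspace(0.0, np.pi, 200)
r = np.cos(3 * theta)
data = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
n_w = 2 * 2 * 4   # 2 inputs, 4 hidden units, 2 outputs (hypothetical network size)

for name, sched in [("exponential", exponential_temperature),
                    ("cycling", cycling_temperature)]:
    _, err = anneal(sched, data, data, n_w)
    print(f"{name:12s} cooling -> initialization MSE {err:.4f}")
```

The sketch covers only the initialization stage; the second, conjugate-gradient stage mentioned in the abstract would start from the weights returned by the annealing step.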



Editor information

Alexander Gelbukh, Ángel Fernando Kuri Morales


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ledesma, S., Torres, M., Hernández, D., Aviña, G., García, G. (2007). Temperature Cycling on Simulated Annealing for Neural Network Learning. In: Gelbukh, A., Kuri Morales, Á.F. (eds.) MICAI 2007: Advances in Artificial Intelligence. MICAI 2007. Lecture Notes in Computer Science (LNAI), vol 4827. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-76631-5_16

  • DOI: https://doi.org/10.1007/978-3-540-76631-5_16

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-76630-8

  • Online ISBN: 978-3-540-76631-5

  • eBook Packages: Computer Science (R0)
