
Theory and Applications of Hybrid Simulated Annealing

  • Chapter
Handbook of Optimization

Part of the book series: Intelligent Systems Reference Library (ISRL, volume 38)


Abstract

Local optimization techniques, such as gradient-based methods and the expectation-maximization algorithm, have the advantage of fast convergence but do not guarantee convergence to the global optimum. On the other hand, global optimization techniques based on stochastic approaches, such as evolutionary algorithms and simulated annealing, offer the possibility of global convergence, but at the expense of increased computational cost. This chapter demonstrates how these two approaches can be combined effectively to improve both convergence speed and solution quality. In particular, a hybrid method called hybrid simulated annealing (HSA) is presented, in which a simulated annealing algorithm is combined with local optimization methods. First, its general procedure and mathematical convergence properties are described. Then, two example applications are presented, namely optimization of hidden Markov models for visual speech recognition and optimization of radial basis function networks for pattern classification, to show how the HSA algorithm can be successfully applied to real-world problems. As an appendix, source code for multi-dimensional Cauchy random number generation, which is essential for implementing the presented method, is provided.
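To make the procedure concrete, the following is a minimal sketch of the HSA loop described above, written in Python with NumPy. It is an illustration under stated assumptions, not the chapter's implementation: the function names (hybrid_simulated_annealing, cauchy_neighbor, local_opt), the fast-annealing cooling schedule T_k = T0 / (1 + k), and the Metropolis acceptance rule are choices made for this sketch. The multivariate Cauchy step uses one standard construction (a Gaussian vector divided by the magnitude of an independent standard normal, i.e., a Student-t draw with one degree of freedom); the chapter's appendix provides its own generator.

import numpy as np

def cauchy_neighbor(x, T, rng):
    # Multivariate Cauchy step around x: a Gaussian direction vector divided
    # by the magnitude of an independent standard normal yields a heavy-tailed
    # (Cauchy) move; scaling by the temperature T allows occasional long jumps
    # out of local minima while the search cools.
    z = rng.standard_normal(x.shape)
    g = rng.standard_normal()
    return x + T * z / abs(g)

def hybrid_simulated_annealing(f, local_opt, x0, T0=1.0, n_iter=2000, seed=0):
    # Minimize f: each candidate is produced by a global Cauchy move followed
    # by a local refinement step (the "hybrid" part), then accepted or
    # rejected by the Metropolis criterion.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    best, fbest = x.copy(), fx
    for k in range(n_iter):
        T = T0 / (1.0 + k)                         # assumed fast-annealing schedule
        y = local_opt(cauchy_neighbor(x, T, rng))  # global move, then local search
        fy = f(y)
        # Accept improvements always; accept uphill moves with
        # probability exp(-(fy - fx) / T).
        if fy <= fx or rng.random() < np.exp(-(fy - fx) / T):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x.copy(), fx
    return best, fbest

As a toy usage, a single gradient step can stand in for the local optimizer on a multimodal Rastrigin-type objective (in the chapter's applications, local optimizers of the kinds named in the abstract, such as EM-style re-estimation or gradient-based training, would play this role):

f = lambda x: float(np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))
grad = lambda x: 2.0 * x + 20.0 * np.pi * np.sin(2.0 * np.pi * x)
local_opt = lambda x: x - 0.01 * grad(x)
x_best, f_best = hybrid_simulated_annealing(f, local_opt, x0=np.full(5, 3.0))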




Author information

Correspondence to Jong-Seok Lee.



Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Lee, J.S., Park, C.H., Ebrahimi, T. (2013). Theory and Applications of Hybrid Simulated Annealing. In: Zelinka, I., Snášel, V., Abraham, A. (eds) Handbook of Optimization. Intelligent Systems Reference Library, vol 38. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-30504-7_16


  • DOI: https://doi.org/10.1007/978-3-642-30504-7_16

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-30503-0

  • Online ISBN: 978-3-642-30504-7

  • eBook Packages: Engineering (R0)
