Abstract
Local optimization techniques such as gradient-based methods and the expectation-maximization algorithm have the advantage of fast convergence but do not guarantee convergence to the global optimum. On the other hand, global optimization techniques based on stochastic approaches, such as evolutionary algorithms and simulated annealing, offer the possibility of global convergence, but at the expense of computational cost and time. This chapter demonstrates how these two approaches can be combined effectively to improve both convergence speed and solution quality. In particular, a hybrid method called hybrid simulated annealing (HSA) is presented, in which a simulated annealing algorithm is combined with local optimization methods. First, its general procedure and mathematical convergence properties are described. Then, two example applications are presented, namely, optimization of hidden Markov models for visual speech recognition and optimization of radial basis function networks for pattern classification, to show how the HSA algorithm can be adopted to solve real-world problems effectively. As an appendix, source code for multi-dimensional Cauchy random number generation is provided, which is essential for implementing the presented method.
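The search procedure outlined above (a simulated annealing loop that interleaves local optimization, with candidate moves drawn from a heavy-tailed multi-dimensional Cauchy distribution) can be sketched as follows. This is a minimal illustration under stated assumptions, not the chapter's exact procedure: the function names, the 1/k cooling schedule, and the way the local optimizer is interleaved are choices made for the sketch. The Cauchy draw uses the standard fact that a Gaussian vector divided by the magnitude of an independent scalar Gaussian follows a multivariate Cauchy distribution (a multivariate t with one degree of freedom).

```python
import numpy as np

def cauchy_step(dim, temperature, rng):
    """Draw an n-dimensional Cauchy step, scaled by the temperature.

    A standard Gaussian vector divided by |w|, with w an independent
    scalar standard Gaussian, is multivariate Cauchy (t with 1 d.o.f.).
    """
    z = rng.standard_normal(dim)
    w = rng.standard_normal()
    return temperature * z / abs(w)

def hybrid_sa(cost, local_opt, x0, t0=1.0, iters=2000, seed=None):
    """Simulated annealing with Cauchy neighbors and local refinement.

    cost      : objective function to minimize
    local_opt : local optimizer mapping a point to a locally improved point
                (e.g. a few gradient or EM steps; identity disables it)
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = cost(x)
    best, fbest = x.copy(), fx
    for k in range(1, iters + 1):
        t = t0 / k                                  # fast-annealing style cooling
        cand = local_opt(x + cauchy_step(x.size, t, rng))
        fc = cost(cand)
        # Metropolis acceptance: always accept improvements,
        # accept worsening moves with probability exp(-delta/t)
        if fc < fx or rng.random() < np.exp(-(fc - fx) / t):
            x, fx = np.asarray(cand, dtype=float), fc
            if fx < fbest:
                best, fbest = x.copy(), fx
    return best, fbest
```

The heavy tails of the Cauchy distribution produce occasional long jumps that help the search escape local minima, which is what permits the faster cooling schedule of fast simulated annealing compared with Gaussian-neighbor schemes.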
© 2013 Springer-Verlag Berlin Heidelberg
Cite this chapter
Lee, JS., Park, C.H., Ebrahimi, T. (2013). Theory and Applications of Hybrid Simulated Annealing. In: Zelinka, I., Snášel, V., Abraham, A. (eds) Handbook of Optimization. Intelligent Systems Reference Library, vol 38. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-30504-7_16
DOI: https://doi.org/10.1007/978-3-642-30504-7_16
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-30503-0
Online ISBN: 978-3-642-30504-7
eBook Packages: Engineering (R0)