Abstract
A new inexact method of tangent hyperbolas (NIMTH) has been proposed recently. In NIMTH, the Newton equation and the Newton-like equation are solved, periodically, by one Cholesky factorization (CF) step and p preconditioned conjugate gradient (PCG) steps, respectively. The algorithm is theoretically efficient, but its implementation has remained restricted. In this paper, an efficient version of NIMTH is presented, in which the parameter p is independent of the complexity of the objective function and the tensor terms can be evaluated efficiently by automatic differentiation. Further theoretical analysis and numerical experiments show that this version of NIMTH is highly competitive for medium- and large-scale unconstrained optimization problems.
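The alternation the abstract describes, one exact (Cholesky-factorized) Newton step followed by p inexact steps in which the Newton equation is solved only approximately by a few conjugate gradient iterations, can be sketched as follows. This is a minimal illustration of that pattern, not the paper's algorithm: the test objective `grad_hess`, the unpreconditioned CG (the paper uses PCG), and all parameter values are assumptions made for the sketch.

```python
import numpy as np

def grad_hess(x):
    # Hypothetical smooth strongly convex test objective (not from the paper):
    # f(x) = 0.5 * x^T A x + sum_i x_i^4, with A = diag(1, 2, ..., n).
    n = len(x)
    A = np.diag(np.arange(1.0, n + 1))
    g = A @ x + 4.0 * x**3          # gradient
    H = A + np.diag(12.0 * x**2)    # Hessian (SPD here)
    return g, H

def cg(H, b, iters):
    # Plain conjugate gradient for H d = b (unpreconditioned for brevity;
    # NIMTH uses a preconditioned CG).
    d = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    for _ in range(iters):
        rr = r @ r
        if rr < 1e-30:              # already solved to machine precision
            break
        Hp = H @ p
        alpha = rr / (p @ Hp)
        d = d + alpha * p
        r = r - alpha * Hp
        p = r + ((r @ r) / rr) * p
    return d

def nimth_like(x0, p=5, max_iter=20, tol=1e-8):
    # Periodically: one direct step via Cholesky factorization (CF),
    # then p inexact steps, each using p CG iterations on the Newton equation.
    x = x0.copy()
    for k in range(max_iter):
        g, H = grad_hess(x)
        if np.linalg.norm(g) < tol:
            break
        if k % (p + 1) == 0:
            L = np.linalg.cholesky(H)                       # CF step
            d = np.linalg.solve(L.T, np.linalg.solve(L, -g))
        else:
            d = cg(H, -g, iters=p)                          # inexact step
        x = x + d
    return x

x = nimth_like(np.full(8, 0.5))
print("||x*|| =", float(np.linalg.norm(x)))   # minimizer of the test objective is x = 0
```

The point of the schedule is cost: the factorization is reused conceptually as a preconditioner in NIMTH, so the p cheap CG-based steps amortize the one expensive direct solve; here the two step types are simply alternated to show the structure.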
The work was supported by the Mathematics and Physics Foundation of Beijing University of Technology (Grant No. Kz0603200381) and the National Science Foundation of China (Grant No. 60503031).
Copyright information
© 2007 Springer-Verlag Berlin Heidelberg
Cite this paper
Zhang, H., Cheng, Q., Xue, Y., Deng, N. (2007). An Efficient Version on a New Improved Method of Tangent Hyperbolas. In: Li, K., Fei, M., Irwin, G.W., Ma, S. (eds) Bio-Inspired Computational Intelligence and Applications. LSMS 2007. Lecture Notes in Computer Science, vol 4688. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74769-7_23
Print ISBN: 978-3-540-74768-0
Online ISBN: 978-3-540-74769-7