Abstract
A parallel Newton method is described for the minimization of a twice continuously differentiable, uniformly convex function F(x). The algorithm generates a sequence {x^j} that converges superlinearly to the global minimizer of F(x).
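For orientation, the classical (serial) Newton iteration that the paper parallelizes can be sketched in one dimension. This is a minimal illustration under assumed names (`newton_minimize`, the example function F(x) = x² + eˣ), not the paper's asynchronous parallel scheme; the example F is uniformly convex since F''(x) = 2 + eˣ ≥ 2.

```python
import math

def newton_minimize(grad, hess, x0, tol=1e-10, max_iter=50):
    """Classical Newton iteration x_{k+1} = x_k - F'(x_k)/F''(x_k).

    Assumes F is twice continuously differentiable and uniformly
    convex, so hess(x) > 0 everywhere and the minimizer is unique.
    """
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:  # first-order optimality reached
            break
        x -= g / hess(x)  # Newton step
    return x

# Illustrative example: F(x) = x**2 + exp(x), with F''(x) = 2 + exp(x) >= 2.
grad = lambda x: 2 * x + math.exp(x)
hess = lambda x: 2 + math.exp(x)
x_star = newton_minimize(grad, hess, x0=0.0)
```

In the parallel setting studied in the paper, the expensive ingredients of each step, the gradient and Hessian evaluations (obtainable via automatic differentiation) and the linear solve, are distributed across processors.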
Fischer, H., Ritter, K., "An asynchronous parallel Newton method," Mathematical Programming 42, 363–374 (1988). https://doi.org/10.1007/BF01589411