Abstract
This paper presents an algorithm for minimizing a function of one variable that uses function values, but not derivative values, at five points to generate each iterate. It employs quadratic and polyhedral approximations together with a safeguard. The basic method without the safeguard exhibits a type of better than linear convergence for certain piecewise twice continuously differentiable functions. The safeguard guarantees convergence to a stationary point for very general functions and preserves the better than linear convergence of the basic method.
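The general idea of safeguarded interpolation can be illustrated with a minimal derivative-free sketch. This is not the paper's five-point method: it fits a quadratic through a three-point bracket and, when the interpolation step is unreliable, falls back to a golden-section step as the safeguard. The function name and the specific safeguard test are illustrative assumptions.

```python
import math

def safeguarded_quadratic_min(f, a, b, tol=1e-8, max_iter=100):
    """Minimize f on [a, b] by quadratic interpolation with a
    golden-section safeguard (a generic sketch, not the paper's
    five-point algorithm)."""
    gr = (math.sqrt(5.0) - 1.0) / 2.0   # golden-section ratio
    x = a + (1.0 - gr) * (b - a)        # initial interior point
    fa, fx, fb = f(a), f(x), f(b)
    for _ in range(max_iter):
        if b - a < tol:
            break
        # Vertex of the quadratic through (a, fa), (x, fx), (b, fb).
        num = (x - a) ** 2 * (fx - fb) - (x - b) ** 2 * (fx - fa)
        den = (x - a) * (fx - fb) - (x - b) * (fx - fa)
        u = x - 0.5 * num / den if den != 0.0 else None
        # Safeguard: reject a degenerate fit, a step outside the
        # bracket, or one too close to x; take a golden step instead.
        if u is None or not (a < u < b) or abs(u - x) < 0.1 * tol:
            u = x + gr * (b - x) if (b - x) > (x - a) else x - gr * (x - a)
        fu = f(u)
        # Shrink the bracket, keeping the best point interior.
        if fu <= fx:
            if u < x:
                b, fb = x, fx
            else:
                a, fa = x, fx
            x, fx = u, fu
        elif u < x:
            a, fa = u, fu
        else:
            b, fb = u, fu
    return x, fx
```

The safeguard plays the role described in the abstract: the interpolation step is taken when it is well defined and stays inside the current bracket, and otherwise a globally convergent bracketing step is substituted, so progress is guaranteed without sacrificing the fast local steps.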
Additional information
This paper is dedicated to Phil Wolfe on the occasion of his 65th birthday.
Research sponsored by the Institut National de Recherche en Informatique et Automatique, Rocquencourt, France, and by the Air Force Office of Scientific Research, Air Force System Command, USAF, under Grant Number AFOSR-83-0210.
Research sponsored, in part, by the Institut National de Recherche en Informatique et Automatique, Rocquencourt, France.
Cite this article
Mifflin, R., Strodiot, J.J. A rapidly convergent five-point algorithm for univariate minimization. Mathematical Programming 62, 299–319 (1993). https://doi.org/10.1007/BF01585172