Abstract
We consider the Spectral Projected Gradient method for solving constrained optimization problems in which the objective function is given as a mathematical expectation. The feasible set is assumed to be convex, closed, and easy to project onto. The objective function is approximated by a sequence of Sample Average Approximation (SAA) functions with different sample sizes. The sample-size update is based on two error estimates: the SAA error and the approximate-solution error. The Spectral Projected Gradient method is combined with a nonmonotone line search. Almost sure convergence is established without imposing an explicit sample-growth condition. Preliminary numerical results demonstrate the efficiency of the proposed method.
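The ingredients described above (projection onto an easy feasible set, a spectral Barzilai-Borwein steplength, a Grippo-style nonmonotone line search, and SAA functions with growing sample sizes) can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the names `spg_saa`, `sample`, and `loss_grad` are made up for the example, and doubling the sample size each outer cycle is a simple stand-in for the paper's adaptive update driven by the two error estimates.

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi] (an 'easy' feasible set)."""
    return np.clip(x, lo, hi)

def spg_saa(x0, sample, loss_grad, project, n0=100, outer=6, inner=50,
            M=10, gamma=1e-4, seed=0):
    """SPG with a nonmonotone line search on a sequence of SAA problems.

    sample(rng, n) draws n scenarios; loss_grad(x, xi) returns the vector of
    per-scenario losses and the averaged gradient.  Doubling n each outer
    cycle is a crude stand-in for an error-estimate-based sample update.
    """
    rng = np.random.default_rng(seed)
    x = project(np.asarray(x0, dtype=float))
    n = n0
    for _ in range(outer):
        xi = sample(rng, n)                       # fixed sample -> one SAA function
        f = lambda z, xi=xi: loss_grad(z, xi)[0].mean()
        g = lambda z, xi=xi: loss_grad(z, xi)[1]
        lam, hist = 1.0, [f(x)]                   # spectral step, recent f-values
        for _ in range(inner):
            d = project(x - lam * g(x)) - x       # projected spectral direction
            if np.linalg.norm(d) < 1e-8:
                break
            fref, gd = max(hist[-M:]), g(x) @ d   # nonmonotone Armijo reference
            alpha = 1.0
            while alpha > 1e-10 and f(x + alpha * d) > fref + gamma * alpha * gd:
                alpha *= 0.5                      # backtracking; x + alpha*d stays feasible
            s = alpha * d
            x_new = x + s
            y = g(x_new) - g(x)
            sy = s @ y                            # Barzilai-Borwein steplength update
            lam = np.clip(s @ s / sy, 1e-10, 1e10) if sy > 0 else 1e10
            x = x_new
            hist.append(f(x))
        n *= 2                                    # grow the sample size
    return x

# Toy problem: minimize E[(x - xi)^2] over the box [1, 2] with xi ~ N(0, 1);
# the solution is the projection of E[xi] = 0 onto the box, i.e. x = 1.
sample = lambda rng, n: rng.standard_normal(n)
loss_grad = lambda x, xi: ((x[0] - xi) ** 2, np.array([2.0 * (x[0] - xi).mean()]))
x_star = spg_saa(np.array([1.5]), sample, loss_grad,
                 lambda z: project_box(z, 1.0, 2.0))
```

Because each trial point `x + alpha*d` is a convex combination of two feasible points, the line search never leaves the feasible set, which is the key convenience of the projected-direction formulation.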
Acknowledgements
We are grateful to the associate editor and two anonymous referees whose constructive remarks helped us to improve this paper.
Research supported by the Serbian Ministry of Education, Science and Technological Development, Grant No. 174030.
Krejić, N., Krklec Jerinkić, N. Spectral projected gradient method for stochastic optimization. J Glob Optim 73, 59–81 (2019). https://doi.org/10.1007/s10898-018-0682-6