
Algorithm 896: LSA: Algorithms for large-scale optimization

Published: 23 July 2009

Abstract

We present 14 basic Fortran subroutines for large-scale unconstrained and box-constrained optimization and for large-scale systems of nonlinear equations. Subroutines PLIS and PLIP, intended for dense general optimization problems, are based on limited-memory variable metric methods. Subroutine PNET, also intended for dense general optimization problems, is based on an inexact truncated Newton method. Subroutines PNED and PNEC, intended for sparse general optimization problems, are based on modifications of the discrete Newton method. Subroutines PSED and PSEC, intended for partially separable optimization problems, are based on partitioned variable metric updates. Subroutine PSEN, intended for nonsmooth partially separable optimization problems, is based on partitioned variable metric updates and on an aggregation of subgradients. Subroutines PGAD and PGAC, intended for sparse nonlinear least-squares problems, are based on modifications and corrections of the Gauss-Newton method. Subroutine PMAX, intended for minimization of a maximum value (minimax), is based on the primal line-search interior-point method. Subroutine PSUM, intended for minimization of a sum of absolute values, is based on the primal trust-region interior-point method. Subroutines PEQN and PEQL, intended for sparse systems of nonlinear equations, are based on the discrete Newton method and on the inverse column-update quasi-Newton method, respectively. Besides describing the methods and codes, we report computational experiments that demonstrate the efficiency of the proposed algorithms.
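The package itself consists of Fortran subroutines; as a language-neutral illustration of the limited-memory variable metric idea behind PLIS and PLIP, the sketch below implements the well-known L-BFGS two-loop recursion with an Armijo backtracking line search. All function names, parameter values, and the line-search choice here are illustrative, not taken from the package:

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    # Two-loop recursion: apply the implicit inverse-Hessian
    # approximation built from the stored pairs (s_k, y_k) to g.
    q = g.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    if s_list:
        # Initial scaling gamma = s'y / y'y (a common default choice)
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * np.dot(y, r)
        r += (a - b) * s
    return -r  # descent direction

def lbfgs(f, grad_f, x0, m=5, tol=1e-8, max_iter=500):
    x = np.asarray(x0, dtype=float)
    s_list, y_list = [], []
    g = grad_f(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = lbfgs_direction(g, s_list, y_list)
        fx, t = f(x), 1.0
        while f(x + t * d) > fx + 1e-4 * t * np.dot(g, d) and t > 1e-12:
            t *= 0.5  # Armijo backtracking
        x_new = x + t * d
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g
        if np.dot(s, y) > 1e-12:  # keep the pair only if curvature is positive
            s_list.append(s); y_list.append(y)
            if len(s_list) > m:   # limited memory: drop the oldest pair
                s_list.pop(0); y_list.pop(0)
        x, g = x_new, g_new
    return x
```

The memory parameter `m` caps storage at a few vector pairs, which is what makes the method viable for large dense problems.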
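The inexact truncated Newton approach behind PNET can likewise be sketched: an outer Newton iteration whose linear system is solved only approximately by conjugate gradients, with Hessian-vector products obtained from gradient differences so the Hessian is never formed. This is a simplified sketch with assumed tolerances and a simple forcing sequence, not the PNET code:

```python
import numpy as np

def hessvec(grad_f, x, v, g, eps=1e-6):
    # Finite-difference Hessian-vector product: H v ~ (g(x + h v) - g(x)) / h
    nv = np.linalg.norm(v)
    if nv == 0.0:
        return np.zeros_like(v)
    h = eps / nv
    return (grad_f(x + h * v) - g) / h

def truncated_newton(f, grad_f, x0, tol=1e-6, max_outer=200):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_outer):
        g = grad_f(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        eta = min(0.5, np.sqrt(gnorm))       # forcing term: solve loosely far away
        d = np.zeros_like(x)
        r = -g.copy()
        p = r.copy()
        for _ in range(2 * x.size):          # inner CG on the Newton system H d = -g
            Hp = hessvec(grad_f, x, p, g)
            pHp = np.dot(p, Hp)
            if pHp <= 1e-12 * np.dot(p, p):  # (near-)negative curvature: truncate
                if not d.any():
                    d = -g                   # fall back to steepest descent
                break
            alpha = np.dot(r, r) / pHp
            d = d + alpha * p
            r_new = r - alpha * Hp
            if np.linalg.norm(r_new) <= eta * gnorm:
                break                        # residual small enough: truncate
            beta = np.dot(r_new, r_new) / np.dot(r, r)
            r, p = r_new, r_new + beta * p
        fx, t = f(x), 1.0
        while f(x + t * d) > fx + 1e-4 * t * np.dot(g, d) and t > 1e-12:
            t *= 0.5                         # Armijo backtracking
        x = x + t * d
    return x
```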
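For the least-squares codes PGAD and PGAC, the abstract mentions "modifications and corrections" of the Gauss-Newton method; a Levenberg-style damping term is one classical such correction, sketched below. The actual codes use more elaborate hybrid strategies, so this is only an assumed minimal variant:

```python
import numpy as np

def gauss_newton(residual, jac, x0, mu=1e-3, tol=1e-10, max_iter=200):
    # Damped Gauss-Newton: each step solves (J'J + mu I) d = -J'r.
    # The damping term mu keeps the step well defined when J'J is
    # (near-)singular, which is where plain Gauss-Newton breaks down.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        J = jac(x)
        g = J.T @ r                      # gradient of 0.5 * ||r||^2
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(J.T @ J + mu * np.eye(x.size), -g)
        if np.sum(residual(x + d)**2) < np.sum(r**2):
            x = x + d
            mu = max(0.5 * mu, 1e-12)    # step accepted: relax damping
        else:
            mu *= 10.0                   # step rejected: increase damping
    return x
```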
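Finally, the discrete Newton idea behind PEQN can be sketched with a forward-difference Jacobian followed by a Newton solve. The real code exploits sparsity, grouping Jacobian columns (graph coloring) so that few function evaluations suffice; this dense sketch omits that and is purely illustrative:

```python
import numpy as np

def fd_jacobian(F, x, eps=1e-7):
    # Forward-difference Jacobian approximation, one column per
    # perturbed variable. A sparse implementation would perturb
    # groups of structurally independent columns at once.
    F0 = F(x)
    J = np.empty((F0.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += eps
        J[:, j] = (F(xp) - F0) / eps
    return J

def discrete_newton(F, x0, tol=1e-10, max_iter=50):
    # Newton's method for F(x) = 0 with a difference-approximated Jacobian.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        x = x + np.linalg.solve(fd_jacobian(F, x), -Fx)
    return x
```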

Supplementary Material

Zip (896.zip)
Software for LSA: Algorithms for large-scale optimization




Published In

ACM Transactions on Mathematical Software, Volume 36, Issue 3
July 2009
122 pages
ISSN:0098-3500
EISSN:1557-7295
DOI:10.1145/1527286

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 23 July 2009
Accepted: 01 January 2009
Revised: 01 August 2008
Received: 01 December 2007
Published in TOMS Volume 36, Issue 3


Author Tags

  1. Large-scale optimization
  2. discrete Newton methods
  3. large-scale nonlinear least squares
  4. large-scale nonlinear minimax
  5. large-scale nonsmooth optimization
  6. large-scale systems of nonlinear equations
  7. limited-memory methods
  8. partially separable problems
  9. primal interior-point methods
  10. quasi-Newton methods
  11. sparse problems


