
Computational Experience with a Safeguarded Barrier Algorithm for Sparse Nonlinear Programming

Published in: Computational Optimization and Applications

Abstract

We describe an enhanced version of the primal-dual interior point algorithm of Lasdon, Plummer, and Yu (ORSA Journal on Computing, vol. 7, no. 3, pp. 321–332, 1995), designed to improve convergence with minimal loss of efficiency and to solve large sparse nonlinear problems that may not be convex. New features include (a) a backtracking linesearch using an L1 exact penalty function, (b) ensuring that search directions are downhill for this function by increasing Lagrangian Hessian diagonal elements when necessary, (c) a quasi-Newton option, in which the Lagrangian Hessian is replaced by a positive definite approximation, (d) inexact solution of each barrier subproblem, in order to approach the central trajectory as the barrier parameter approaches zero, and (e) solution of the symmetric indefinite linear Newton equations by a multifrontal sparse Gaussian elimination procedure, as implemented in the MA47 subroutine from the Harwell Library (Rutherford Appleton Laboratory Report RAL-95-001, Oxfordshire, UK, Jan. 1995). Second derivatives of all problem functions are required when the true Hessian option is used. A Fortran implementation is briefly described. Computational results are presented for 34 smaller models coded in Fortran, where first and second derivatives are approximated by differencing, and for 89 larger GAMS models, where analytic first derivatives are available and finite differencing is used for second partials. The GAMS results are, to our knowledge, the first to show the performance of this promising class of algorithms on large sparse NLPs. For both small and large problems, the true Hessian and quasi-Newton options are both quite reliable and converge rapidly. Using the true Hessian, INTOPT is as reliable as MINOS on the GAMS models, although not as reliable as CONOPT. Computation times are considerably longer than for the other two solvers. However, interior point methods should be considerably faster than they are here when analytic second derivatives are available, and algorithmic improvements and problem preprocessing should further narrow the gap.
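Safeguards (a) and (b) can be illustrated with a small sketch (Python/NumPy; all names are hypothetical and this is not the Fortran INTOPT code): the diagonal of the Hessian is shifted until the Newton direction is downhill for the L1 exact penalty merit function, and a backtracking Armijo-type linesearch is then applied to that function. For brevity the nonsmooth penalty's directional derivative is approximated by the smooth inner product g·d, and the shift is a uniform multiple of the identity rather than the selective diagonal increase used in the paper.

```python
import numpy as np

def l1_penalty(f, c, x, mu):
    """Exact L1 penalty: objective plus mu times the 1-norm of constraint violations."""
    return f(x) + mu * np.sum(np.abs(c(x)))

def safeguarded_step(f, c, g, H, x, mu, beta=0.5, sigma=1e-4):
    """One safeguarded step: shift the Hessian diagonal until the Newton
    direction is downhill for the L1 penalty, then backtrack along it.
    g is a (sub)gradient of the penalty at x; H approximates the Lagrangian Hessian."""
    tau = 0.0
    while True:
        d = np.linalg.solve(H + tau * np.eye(len(x)), -g)
        if g @ d < 0:                              # descent for the merit function
            break
        tau = 1e-4 if tau == 0.0 else 10 * tau     # inflate the diagonal and retry
    phi0 = l1_penalty(f, c, x, mu)
    alpha = 1.0
    while l1_penalty(f, c, x + alpha * d, mu) > phi0 + sigma * alpha * (g @ d):
        alpha *= beta                              # Armijo-type backtracking
    return x + alpha * d
```

When H is positive definite the loop accepts the unmodified Newton direction immediately; the shift only activates for indefinite Hessians, which is how the method handles nonconvex problems.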
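Feature (e), the symmetric indefinite Newton solve, can be sketched as follows (a hypothetical illustration: SciPy's sparse LU factorization `splu` stands in for the multifrontal MA47 routine, and `solve_kkt`, `W`, `Sigma` are illustrative names, not the paper's):

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def solve_kkt(W, Sigma, J, rhs):
    """Assemble and solve the symmetric indefinite Newton (KKT) system
        [ W + Sigma   J^T ] [dx]   [ rhs_x ]
        [ J            0  ] [dy] = [ rhs_y ]
    where W approximates the Lagrangian Hessian, Sigma carries the barrier
    terms condensed into the (1,1) block, and J is the constraint Jacobian."""
    K = sp.bmat([[W + Sigma, J.T], [J, None]], format="csc")
    return spla.splu(K).solve(rhs)  # sparse LU; MA47 would also exploit symmetry
```

A solver such as MA47 factors this matrix without reordering it into a positive definite form, which is what makes the direct approach viable for the indefinite systems that arise at each interior point iteration.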


References

  1. M. Argaez and R.A. Tapia, “On the global convergence of a modified augmented Lagrangian linesearch interior point Newton method for nonlinear programming,” Technical Report TR95-29, Dept. of Computational and Applied Mathematics, Rice University, 1995.

  2. M. Breitfeld and D. Shanno, “Computational experience with modified log-barrier methods for nonlinear programming,” Report RRR 17-93, RUTCOR, Rutgers University, 1993.

  3. A. Brooke, D. Kendrick, and A. Meeraus, GAMS: A User's Guide, Boyd and Fraser Publishing Co.: Danvers, MA, 1988.

  4. R. Byrd, J.C. Gilbert, and J. Nocedal, “A trust region method based on interior point techniques for nonlinear programming,” Technical Report OTC 96/02, Optimization Technology Center, Argonne National Laboratory and Northwestern University, 1996.

  5. R.S. Dembo, “A set of geometric programming test problems and their solutions,” Mathematical Programming, vol. 10, pp. 192–213, 1976.

  6. A. Drud, “CONOPT: A large-scale GRG code,” ORSA Journal on Computing, vol. 6, no. 2, pp. 207–216, 1994.

  7. A. Drud, “A general pre-processor for GAMS models,” Talk given at the INFORMS San Diego Meeting, 5/4–5/7, 1997, Session MC33.1.

  8. I.S. Duff and J.K. Reid, “A Fortran code for direct solution of indefinite sparse symmetric linear systems,” Rutherford Appleton Laboratory Report RAL-95-001, Oxfordshire, UK, Jan. 1995.

  9. A.S. El-Bakry, R.A. Tapia, T. Tsuchiya, and Y. Zhang, “On the formulation and theory of the Newton interior-point method for nonlinear programming,” Journal of Optimization Theory and Applications, vol. 89, no. 3, pp. 507–541, 1996.

  10. A.V. Fiacco and G.P. McCormick, Nonlinear Programming: Sequential Unconstrained Minimization Techniques, John Wiley and Sons: New York, NY, 1968.

  11. R. Fourer, D. Gay, and B. Kernighan, AMPL: A Modeling Language for Mathematical Programming, Boyd and Fraser Publishing Co.: Danvers, MA, 1993.

  12. R.S. Gajulapalli, “INTOPT: An interior point algorithm for large scale nonlinear optimization,” Doctoral Dissertation, University of Texas at Austin, 1995.

  13. D. Gay, “More AD of nonlinear AMPL models: Computing Hessian information and exploiting partial separability,” in Computational Differentiation, M. Berz, C. Bischof, G. Corliss, and A. Griewank (Eds.), SIAM, 1996.

  14. S. Granville, “Optimal reactive dispatch through interior point methods,” Internal Report, CEPEL, Av. Hum Q. 5-Cidade Universitária, Ilha do Fundão, CEP 21941, Rio de Janeiro, R.J., Brazil, 1991.

  15. S.P. Han, “Superlinearly convergent variable metric algorithms for general nonlinear programming problems,” Mathematical Programming, vol. 11, pp. 263–282, 1976.

  16. D.M. Himmelblau, Applied Nonlinear Programming, McGraw-Hill: New York, 1972.

  17. L.S. Lasdon, J. Plummer, and G. Yu, “Primal-dual and primal interior point algorithms for general nonlinear programs,” ORSA Journal on Computing, vol. 7, no. 3, pp. 321–332, 1995.

  18. D. Luenberger, Linear and Nonlinear Programming, Addison-Wesley Publishing Co.: Reading, MA, 1984.

  19. B.A. Murtagh and M.A. Saunders, “A projected Lagrangian algorithm and its implementation for sparse nonlinear constraints,” Mathematical Programming Study, vol. 16, pp. 84–117, 1982.

  20. R. Polyak, “Modified barrier functions,” Mathematical Programming, vol. 54, pp. 177–222, 1992.

  21. M.J.D. Powell, “A fast algorithm for nonlinearly constrained optimization calculations,” in Numerical Analysis, Dundee 1977, G.A. Watson (Ed.), Springer-Verlag: Berlin, 1977.

  22. Y. Wu, A. Debs, and R. Marsten, “A direct nonlinear predictor-corrector primal-dual interior point algorithm for optimal power flows,” IEEE Transactions on Power Systems, vol. 9, no. 2, pp. 876–883, 1994.

  23. Y. Zhang and R.A. Tapia, “Superlinear and quadratic convergence of primal-dual interior-point methods for linear programming revisited,” Report TR91-27, Department of Mathematical Sciences, Rice University, 1991.

Cite this article

Gajulapalli, R.S., Lasdon, L. Computational Experience with a Safeguarded Barrier Algorithm for Sparse Nonlinear Programming. Computational Optimization and Applications 19, 107–120 (2001). https://doi.org/10.1023/A:1011276420546
