
Efficient Line Search Method Based on Regression and Uncertainty Quantification

  • Conference paper
  • Learning and Intelligent Optimization (LION 2024)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14990)


Abstract

Unconstrained optimization problems are typically solved with iterative methods, which in each iteration rely on a line search to determine a suitable step length. Traditional line search methods aim at finding an optimal step length but discard most of the data gathered during the search, focusing only on successively refining the step length interval. This paper proposes a more efficient line search based on Bayesian optimization that uses all available data points, i.e., function values and gradients, to guide the search toward a potential global minimum along the search direction. This approach explores the search space more effectively and leads to better solution quality. It is also easy to implement and to integrate into existing frameworks. On the challenging CUTEst test set, the method outperforms existing state-of-the-art methods, solving more problems to optimality with equivalent resource usage.
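
The following is a minimal, self-contained sketch (in Python with NumPy) of the kind of Bayesian-optimization line search the abstract describes; it is an illustration under stated assumptions, not the authors' implementation. It fits a Gaussian-process regression to function values phi(t) = f(x + t*d) sampled along the search direction d and picks each next trial step by minimizing a lower-confidence-bound acquisition over a candidate grid. For brevity the sketch conditions only on function values, whereas the paper's method also uses gradients; the names bo_line_search and phi, as well as the kernel and acquisition hyperparameters, are all illustrative.

import numpy as np

def rbf_kernel(a, b, length=1.0, var=1.0):
    # Squared-exponential kernel k(a, b) between 1-D arrays of step lengths.
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(t_train, y_train, t_query, noise=1e-6):
    # Gaussian-process posterior mean and standard deviation at t_query,
    # conditioned on the observations (t_train, y_train).
    K = rbf_kernel(t_train, t_train) + noise * np.eye(len(t_train))
    Ks = rbf_kernel(t_train, t_query)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    prior_var = rbf_kernel(t_query, t_query).diagonal()
    std = np.sqrt(np.maximum(prior_var - np.sum(v * v, axis=0), 0.0))
    return mean, std

def bo_line_search(phi, t_max=10.0, n_iters=8, kappa=2.0):
    # Pick a step length for phi(t) = f(x + t*d) by fitting a GP to all
    # sampled values and minimizing a lower-confidence-bound acquisition.
    t_train = np.array([0.0, 1.0])            # start from the unit step
    y_train = np.array([phi(t) for t in t_train])
    grid = np.linspace(1e-4, t_max, 400)      # candidate step lengths
    for _ in range(n_iters):
        mean, std = gp_posterior(t_train, y_train, grid)
        t_next = grid[np.argmin(mean - kappa * std)]  # LCB acquisition
        t_train = np.append(t_train, t_next)
        y_train = np.append(y_train, phi(t_next))
    return t_train[np.argmin(y_train)]        # best step seen so far

# Usage: a 1-D slice of f(x) = x^2 along direction d = -1 from x = 3,
# so phi(t) = (3 - t)^2 and the exact minimizer is t = 3.
if __name__ == "__main__":
    phi = lambda t: (3.0 - t) ** 2
    print(bo_line_search(phi))

A Gaussian process is a natural model here because, unlike interval-refinement schemes, it retains every sampled point, and its posterior standard deviation quantifies uncertainty in unexplored regions, letting the acquisition trade off exploring new step lengths against exploiting the current best estimate.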


Author information

Corresponding author: Tomislav Prusina.


Ethics declarations

Disclosure of Interests

The authors have no competing interests to declare that are relevant to the content of this article.


Copyright information

© 2025 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Prusina, T., Laue, S. (2025). Efficient Line Search Method Based on Regression and Uncertainty Quantification. In: Festa, P., Ferone, D., Pastore, T., Pisacane, O. (eds) Learning and Intelligent Optimization. LION 2024. Lecture Notes in Computer Science, vol 14990. Springer, Cham. https://doi.org/10.1007/978-3-031-75623-8_26

  • DOI: https://doi.org/10.1007/978-3-031-75623-8_26

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-75622-1

  • Online ISBN: 978-3-031-75623-8

  • eBook Packages: Computer Science, Computer Science (R0)
