On Low Complexity Acceleration Techniques for Randomized Optimization

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 8672)

Abstract

Nesterov (2011) recently showed that techniques from convex optimization can be used to accelerate simple derivative-free randomized optimization methods. The appeal of these schemes lies in their low complexity of only Θ(n) per iteration, compared to Θ(n²) for algorithms that store second-order information or covariance matrices. From a high-level point of view, these accelerated schemes exploit correlations between successive iterates, a concept resembling the evolution path used in Covariance Matrix Adaptation Evolution Strategies (CMA-ES). In this contribution, we (i) implement and empirically test a simple accelerated random search scheme (SARP). Our study is the first to provide numerical evidence that SARP can be implemented effectively with adaptive step size control, without access to gradient or advanced line search oracles. We (ii) empirically examine the supposed analogy between the evolution path and SARP. To this end, we propose an algorithm, CMA-EP, that uses only the evolution path to bias the search. This algorithm can be generalized to a family of low-memory schemes with complexity Θ(mn) per iteration, following a recent approach by Loshchilov (2014). Our study shows that the performance of CMA-EP depends heavily on the spectrum of the objective function, and thus it cannot accelerate as consistently as SARP.
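
As an illustration of the ideas above, the following Python sketch shows the kind of Θ(n)-per-iteration loop the abstract refers to: an accelerated random search step that reuses the correlation with the previous move, plus a standard CMA-ES-style evolution path update (Hansen and Ostermeier, 2001). The function names and all concrete constants (the momentum factor beta, the step size multipliers, the smoothing rate c) are illustrative assumptions, not the exact SARP or CMA-EP procedures of the paper; the point is only that both ideas need vector operations alone, so memory and per-iteration cost stay linear in n.

    import numpy as np

    def sarp_sketch(f, x0, iterations=1000, sigma0=1.0, beta=0.9,
                    inc=1.5, dec=0.7, seed=0):
        # Hedged sketch of an accelerated random search loop: only
        # vectors are stored and updated, so each iteration costs Θ(n).
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        momentum = np.zeros_like(x)   # correlation with past iterates
        sigma = sigma0                # adaptive step size
        fx = f(x)
        for _ in range(iterations):
            u = rng.standard_normal(x.size)
            u /= np.linalg.norm(u)    # random unit search direction
            trial = x + momentum + sigma * u
            ft = f(trial)
            if ft < fx:               # success: accept, remember the
                momentum = beta * (trial - x)  # move, enlarge the step
                x, fx = trial, ft
                sigma *= inc
            else:                     # failure: shrink step, damp momentum
                sigma *= dec
                momentum *= beta
        return x, fx

    def evolution_path_update(p, step, sigma, c=0.1):
        # Standard CMA-ES-style cumulation (Hansen and Ostermeier, 2001):
        # the evolution path is an exponentially fading sum of realized
        # steps. CMA-EP, as described above, biases the search with such
        # a path alone; the smoothing rate c is chosen for illustration.
        return (1.0 - c) * p + np.sqrt(c * (2.0 - c)) * step / sigma

On a convex quadratic such as f(x) = Σᵢ xᵢ², calling sarp_sketch(lambda x: float(np.sum(x**2)), np.ones(100)) converges using comparison-based feedback only; the success-based step size rule above stands in for the adaptive step size control studied in the paper, and no gradient or line search oracle is queried.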


References

  1. Polyak, B.: Introduction to Optimization. Optimization Software, Inc. (1987)

  2. Nesterov, Y.: Introductory Lectures on Convex Optimization. Kluwer (2004)

  3. Broyden, C.G.: The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations. IMA J. of Appl. Math. 6(1), 76–90 (1970)

  4. Fletcher, R.: A new approach to variable metric algorithms. The Computer Journal 13(3), 317–322 (1970)

  5. Goldfarb, D.: A Family of Variable-Metric Methods Derived by Variational Means. Mathematics of Computation 24(109), 23–26 (1970)

  6. Nocedal, J.: Updating Quasi-Newton Matrices with Limited Storage. Mathematics of Computation 35(151), 773–782 (1980)

  7. Liu, D.C., Nocedal, J.: On the limited memory BFGS method for large scale optimization. Mathematical Programming 45(1-3), 503–528 (1989)

  8. Nesterov, Y.: A method of solving a convex programming problem with convergence rate O(1/k²). Soviet Mathematics Doklady 27(2), 372–376 (1983)

  9. Nesterov, Y.: Smoothing technique and its applications in semidefinite optimization. Mathematical Programming 110(2), 245–259 (2007)

  10. Tseng, P.: On accelerated proximal gradient methods for convex-concave optimization. Submitted to SIAM Journal on Optimization (2008)

  11. Schumer, M., Steiglitz, K.: Adaptive step size random search. IEEE Transactions on Automatic Control 13(3), 270–276 (1968)

  12. Rechenberg, I.: Evolutionsstrategie: Optimierung technischer Systeme nach Prinzipien der biologischen Evolution. Frommann-Holzboog (1973)

  13. Mutseniyeks, V.A., Rastrigin, L.A.: Extremal control of continuous multi-parameter systems by the method of random search. Engineering Cybernetics 1, 82–90 (1964)

  14. Stich, S.U., Müller, C.L., Gärtner, B.: Optimization of convex functions with Random Pursuit. SIAM Journal on Optimization 23(2), 1284–1309 (2013)

  15. Nesterov, Y.: Random Gradient-Free Minimization of Convex Functions. Technical report, ECORE (2011)

  16. Leventhal, D., Lewis, A.S.: Randomized Hessian estimation and directional search. Optimization 60(3), 329–345 (2011)

  17. Stich, S.U., Gärtner, B., Müller, C.L.: Variable Metric Random Pursuit (submitted, 2012), http://arxiv.org/abs/1210.5114

  18. Hansen, N., Ostermeier, A.: Completely Derandomized Self-Adaptation in Evolution Strategies. Evolutionary Computation 9(2), 159–195 (2001)

  19. Hansen, N., Müller, S.D., Koumoutsakos, P.: Reducing the time complexity of the derandomized evolution strategy with covariance matrix adaptation (CMA-ES). Evol. Comput. 11(1), 1–18 (2003)

  20. Knight, J.N., Lunacek, M.: Reducing the Space-time Complexity of the CMA-ES. In: GECCO 2007, pp. 658–665. ACM (2007)

  21. Loshchilov, I.: A Computationally Efficient Limited Memory CMA-ES for Large Scale Optimization. To appear in GECCO (2014), http://arxiv.org/abs/1404.5520

  22. Lee, Y.T., Sidford, A.: Efficient accelerated coordinate descent methods and faster algorithms for solving linear systems. In: FOCS, pp. 147–156. IEEE (2013)

  23. Ostermeier, A., Gawelczyk, A., Hansen, N.: Step-size adaptation based on non-local use of selection information. In: Davidor, Y., Männer, R., Schwefel, H.-P. (eds.) PPSN 1994. LNCS, vol. 866, pp. 189–198. Springer, Heidelberg (1994)

  24. Igel, C., Suttorp, T., Hansen, N.: A Computational Efficient Covariance Matrix Update and a (1+1)-CMA for Evolution Strategies. In: GECCO, pp. 453–460 (2006)

  25. Sun, Y., Schaul, T., Gomez, F., Schmidhuber, J.: A linear time natural evolution strategy for non-separable functions. In: Proc. 15th Genetic and Evolutionary Computation Conference Companion, pp. 61–62. ACM (2013)

  26. Stich, S.U.: Supplementary Online Material (2014), http://arxiv.org/abs/1406.2010

  27. Stich, S.U., Müller, C.L.: On Spectral Invariance of Randomized Hessian and Covariance Matrix Adaptation Schemes. In: Coello, C.A.C., Cutello, V., Deb, K., Forrest, S., Nicosia, G., Pavone, M. (eds.) PPSN 2012, Part I. LNCS, vol. 7491, pp. 448–457. Springer, Heidelberg (2012)




Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Stich, S.U. (2014). On Low Complexity Acceleration Techniques for Randomized Optimization. In: Bartz-Beielstein, T., Branke, J., Filipič, B., Smith, J. (eds) Parallel Problem Solving from Nature – PPSN XIII. PPSN 2014. Lecture Notes in Computer Science, vol 8672. Springer, Cham. https://doi.org/10.1007/978-3-319-10762-2_13

  • DOI: https://doi.org/10.1007/978-3-319-10762-2_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-10761-5

  • Online ISBN: 978-3-319-10762-2

