
Boosting Quantum Annealing Performance Using Evolution Strategies for Annealing Offsets Tuning

  • Conference paper

Quantum Technology and Optimization Problems (QTOP 2019)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11413)

Abstract

In this paper we introduce a novel algorithm to iteratively tune annealing offsets for qubits in a D-Wave 2000Q quantum processing unit (QPU). Using a (1+1)-CMA-ES algorithm, we are able to improve the performance of the QPU by up to a factor of 12.4 in the probability of obtaining ground states for small problems, and to obtain previously inaccessible (i.e., better) solutions for larger problems. We also make efficient use of QPU samples as a resource, using 100 times fewer samples than existing tuning methods. The success of this approach demonstrates how quantum computing can benefit from classical algorithms, and opens the door to new hybrid methods of computing.


Notes

  1. Each qubit in the QPU has a different range in \(\varDelta s\) that can be set independently. \(\varDelta s < 0\) is an advance in time and \(\varDelta s > 0\) is a delay. A typical range for \(\varDelta s\) is \(\pm 0.15\). The total annealing time for all qubits is still \(|s|=1\) (or \(\tau\) in units of time).
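The per-qubit ranges described above can be made concrete with a small helper (a hypothetical illustration, not part of any D-Wave API) that clamps a proposed offset vector to each qubit's allowed interval:

```python
def clamp_offsets(offsets, ranges):
    """Clamp each qubit's proposed offset (Delta s) to its allowed [lo, hi] range."""
    return [min(max(o, lo), hi) for o, (lo, hi) in zip(offsets, ranges)]

# Assume a typical range of +/-0.15 for every qubit:
ranges = [(-0.15, 0.15)] * 3
print(clamp_offsets([0.2, -0.3, 0.05], ranges))  # -> [0.15, -0.15, 0.05]
```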

References

  1. King, J., Yarkoni, S., Nevisi, M.M., Hilton, J.P., McGeoch, C.C.: Benchmarking a quantum annealing processor with the time-to-target metric. arXiv:1508.05087 (2015)

  2. Bian, Z., Chudak, F., Israel, R.B., Lackey, B., Macready, W.G., Roy, A.: Mapping constrained optimization problems to quantum annealing with application to fault diagnosis. Front. ICT 3, 14 (2016)

  3. Neukart, F., Compostella, G., Seidel, C., von Dollen, D., Yarkoni, S., Parney, B.: Traffic flow optimization using a quantum annealer. Front. ICT 4, 29 (2017)

  4. Raymond, J., Yarkoni, S., Andriyash, E.: Global warming: temperature estimation in annealers. Front. ICT 3, 23 (2016)

  5. Venturelli, D., Marchand, D.J.J., Rojo, G.: Quantum annealing implementation of job-shop scheduling. arXiv:1506.08479 (2015)

  6. King, A.D., et al.: Observation of topological phenomena in a programmable lattice of 1,800 qubits. Nature 560(7719), 456–460 (2018)

  7. Yarkoni, S., Plaat, A., Bäck, T.: First results solving arbitrarily structured maximum independent set problems using quantum annealing. In: 2018 IEEE Congress on Evolutionary Computation (CEC), (Rio de Janeiro, Brazil), pp. 1184–1190 (2018)

  8. Johnson, M.W., et al.: Quantum annealing with manufactured spins. Nature 473, 194–198 (2011)

  9. Barahona, F.: On the computational complexity of Ising spin glass models. J. Phys. A: Math. Gen. 15(10), 3241 (1982)

  10. Lucas, A.: Ising formulations of many NP problems. Front. Phys. 2, 5 (2014)

  11. Lanting, T., King, A.D., Evert, B., Hoskinson, E.: Experimental demonstration of perturbative anticrossing mitigation using non-uniform driver Hamiltonians. arXiv:1708.03049 (2017)

  12. Andriyash, E., Bian, Z., Chudak, F., Drew-Brook, M., King, A.D., Macready, W.G., Roy, A.: Boosting integer factoring performance via quantum annealing offsets. https://www.dwavesys.com/resources/publications

  13. Hsu, T.-J., Jin, F., Seidel, C., Neukart, F., Raedt, H.D., Michielsen, K.: Quantum annealing with anneal path control: application to 2-SAT problems with known energy landscapes. arXiv:1810.00194 (2018)

  14. Susa, Y., Yamashiro, Y., Yamamoto, M., Nishimori, H.: Exponential speedup of quantum annealing by inhomogeneous driving of the transverse field. J. Phys. Soc. Jpn. 87(2), 023002 (2018)

  15. Kadowaki, T., Nishimori, H.: Quantum annealing in the transverse Ising model. Phys. Rev. E 58, 5355–5363 (1998)

  16. Hansen, N.: The CMA evolution strategy: a comparing review. In: Lozano, J.A., Larrañaga, P., Inza, I., Bengoetxea, E. (eds.) Towards a New Evolutionary Computation: Advances in the Estimation of Distribution Algorithms, pp. 75–102. Springer, Heidelberg (2006). https://doi.org/10.1007/3-540-32494-1_4

  17. Igel, C., Suttorp, T., Hansen, N.: A computational efficient covariance matrix update and a (1+1)-CMA for evolution strategies. In: Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation, GECCO 2006, pp. 453–460. ACM, New York (2006)

  18. Auger, A., Hansen, N.: Benchmarking the (1+1)-CMA-ES on the BBOB-2009 Noisy Testbed. In: Proceedings of the 11th Annual Conference Companion on Genetic and Evolutionary Computation Conference: Late Breaking Papers, GECCO 2009, pp. 2467–2472. ACM, New York (2009)

Author information

Correspondence to Sheir Yarkoni.

Appendices

A Evaluating the Fitness Function of \((1+1)\)-CMA-ES using the QPU

Here we show an example of a single tuning run of \((1+1)\)-CMA-ES for a 40-node graph, with the initial offsets all set to zero. As explained in Algorithm 1, we use the mean energy of 100 samples returned by the QPU as the evaluation of the fitness function of the CMA-ES. The sample size of 100 was determined empirically as the minimum number of samples needed to estimate the mean energy, and is consistent with previous results [4]. In Fig. 3 (left) we show the progression of the CMA-ES routine and the associated fitness function. The tuning shows a clear improvement in mean energy, visible both in the fitness function and in its cumulative minimum. Every time the objective function improves, the annealing offsets used in that sample set are recorded. The evolution of the annealing offsets for this 40-variable instance is shown in Fig. 3 (right). The final offsets after tuning were then used to evaluate performance.
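The evaluation loop above can be sketched as a simplified (1+1)-ES (without the covariance adaptation of the full \((1+1)\)-CMA-ES). The QPU call is replaced by a toy surrogate fitness, so everything here (the surrogate, the step-size schedule, the function names) is an illustrative assumption rather than the paper's implementation; only the all-zeros start, the mean-energy fitness role, and the \(\pm 0.15\) offset bound follow the text:

```python
import random

def mean_energy(offsets):
    # Toy surrogate: in the paper, this would be the mean energy of
    # 100 QPU samples drawn with the candidate annealing offsets.
    return sum((x - 0.02) ** 2 for x in offsets)

def tune_offsets(n_qubits, iterations=200, sigma=0.05, seed=1):
    """Simplified (1+1)-ES: mutate the parent, keep the child iff no worse."""
    rng = random.Random(seed)
    parent = [0.0] * n_qubits          # initial offsets: all zeros
    best = mean_energy(parent)
    for _ in range(iterations):
        # Gaussian mutation, clamped to a typical +/-0.15 offset range
        child = [min(max(x + rng.gauss(0.0, sigma), -0.15), 0.15)
                 for x in parent]
        fitness = mean_energy(child)
        if fitness <= best:            # (1+1) elitist selection
            parent, best = child, fitness
            sigma *= 1.1               # crude 1/5th-rule-style step-size control
        else:
            sigma *= 0.98
    return parent, best

offsets, best = tune_offsets(5)
print(best <= mean_energy([0.0] * 5))  # True: elitism never loses ground
```

The elitist acceptance rule guarantees the returned fitness is never worse than the starting configuration, which is why the tuning trace in Fig. 3 (left) is monotone in its cumulative minimum.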

Fig. 3.

Left: The fitness function evolution (mean energy of 100 samples) is shown as a function of the iteration number of CMA-ES. The red line represents the value of the fitness function at each iteration, and the blue line is the cumulative minimum, representing the best solution found so far. Right: The evolution of the annealing offsets is shown as a function of the iteration number of CMA-ES (updated every time an improvement is found). (Color figure online)
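The cumulative minimum plotted as the blue line is a running best-so-far over the fitness trace; a generic sketch (not tied to the paper's code):

```python
def cumulative_min(values):
    """Running best-so-far of a fitness trace."""
    out, cur = [], float("inf")
    for v in values:
        cur = min(cur, v)
        out.append(cur)
    return out

print(cumulative_min([5.0, 4.2, 4.6, 3.9]))  # -> [5.0, 4.2, 4.2, 3.9]
```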

B Analysis of Tuned Annealing Offsets

Here we present the aggregated results of all annealing offsets after tuning. Figure 4 (left and right) shows the final offset values for all problem instances using the CMA-ES routine with initial offsets set to zero and uniform, respectively. We found that the final offset values were not correlated with chain length or success probability. However, we did see a systematic shift in the final offset values with respect to the degree of the node in the graph, and as a function of problem size. In both figures, we see divergent behavior in offsets for nodes of very low and very high degree, with consistent stability in the mid-range. The main difference between the two figures is the final value of the offsets in this middle region. In Fig. 4 (left), the average offset value rises from 0 for small problems, to roughly 0.02 for problems with 40 variables, then back down to 0 for the largest problems. There is also a slight increase in average offset value from degree 3 to degree 14, found consistently for all problem sizes. In contrast, Fig. 4 (right) shows that the final offset values were roughly 0.02 at all problem sizes, apart from the divergent behavior at the extrema of the degree axis. The difference between the two configurations could explain why initial offsets set to zero performed slightly better than uniform initial offsets. Given the fixed budget of 10,000 samples for calibration per MIS instance, escaping from a local optimum (such as the null initial configuration) becomes increasingly difficult at larger problem sizes, thus degrading the uniform configuration's performance. Beyond the results shown here, we were not able to extract any meaningful information with respect to other parameters of interest.

Fig. 4.

Left: Final offset value as determined by (1+1)-CMA-ES with initial offsets set to zero, as a function of the degree of the logical node in the graph. Colors represent different problem sizes. Right: Same as in left, but for initial offsets set uniformly in their allowed range. (Color figure online)

Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Yarkoni, S., Wang, H., Plaat, A., Bäck, T. (2019). Boosting Quantum Annealing Performance Using Evolution Strategies for Annealing Offsets Tuning. In: Feld, S., Linnhoff-Popien, C. (eds) Quantum Technology and Optimization Problems. QTOP 2019. Lecture Notes in Computer Science, vol. 11413. Springer, Cham. https://doi.org/10.1007/978-3-030-14082-3_14

  • DOI: https://doi.org/10.1007/978-3-030-14082-3_14

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-14081-6

  • Online ISBN: 978-3-030-14082-3
