
Training Parameterized Quantum Circuits with Triplet Loss

  • Conference paper
  • First Online:
Machine Learning and Knowledge Discovery in Databases (ECML PKDD 2022)

Abstract

Training parameterized quantum circuits (PQCs) is a growing research area that has received a boost from the emergence of new hybrid quantum-classical algorithms and Quantum Machine Learning (QML) aiming to leverage the power of today's quantum computers. However, a universal pipeline that guarantees good learning behavior has not yet been found, due to several challenges. These include, in particular, the low number of qubits and their susceptibility to noise, as well as vanishing gradients during training. In this work, we apply and evaluate Triplet Loss in a QML training pipeline utilizing a PQC for the first time. We perform extensive experiments for the Triplet-Loss-based setup and training on two common datasets, MNIST and the moons dataset. Without significant fine-tuning of training parameters and circuit layout, our proposed approach achieves results competitive with regular training. Additionally, the variance and the absolute values of the gradients are significantly improved compared to training a PQC without Triplet Loss. Metric learning proves suitable for QML and its high-dimensional space, as it is less restrictive than learning on hard labels. Our results indicate that metric learning helps mitigate the so-called barren plateaus.
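To make the setting concrete, the sketch below shows how a PQC can act as an embedding function f(x) trained with the standard triplet loss, which pulls an anchor embedding toward a same-class positive and pushes it away from a different-class negative by at least a margin: L = max(d(a, p) - d(a, n) + margin, 0). This is a minimal illustration only, assuming a PennyLane-style pipeline with an angle embedding, a StronglyEntanglingLayers ansatz, and measurement probabilities as the embedding vector; circuit layout, measurement choice, and hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the paper's implementation): a PQC as an embedding
# function trained with the triplet loss
#   L = max(||f(a) - f(p)||^2 - ||f(a) - f(n)||^2 + margin, 0).
import pennylane as qml
from pennylane import numpy as np  # autograd-enabled NumPy shipped with PennyLane

n_qubits, n_layers, margin = 4, 2, 0.5
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def embed(x, weights):
    # Encode the (pre-scaled) classical features as rotation angles.
    qml.AngleEmbedding(x, wires=range(n_qubits))
    # Trainable variational layers; any hardware-efficient ansatz could be swapped in.
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    # Measurement probabilities serve as the embedding vector here,
    # a stand-in for whatever observables the paper actually measures.
    return qml.probs(wires=range(n_qubits))

def triplet_loss(weights, anchor, positive, negative):
    fa, fp, fn = (embed(x, weights) for x in (anchor, positive, negative))
    d_pos = np.sum((fa - fp) ** 2)  # squared distance anchor-positive
    d_neg = np.sum((fa - fn) ** 2)  # squared distance anchor-negative
    return np.maximum(d_pos - d_neg + margin, 0.0)

# One gradient step on a single (hypothetical) triplet.
shape = qml.StronglyEntanglingLayers.shape(n_layers=n_layers, n_wires=n_qubits)
weights = np.random.random(size=shape, requires_grad=True)
anchor, positive, negative = (np.random.random(n_qubits, requires_grad=False)
                              for _ in range(3))
opt = qml.GradientDescentOptimizer(stepsize=0.1)
weights = opt.step(lambda w: triplet_loss(w, anchor, positive, negative), weights)
```

In a full training loop, triplets would be mined from mini-batches of the dataset and the loss averaged over them; the single-triplet step above only illustrates the mechanics.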



Acknowledgements

The authors acknowledge support by the state of Baden-Württemberg through bwHPC.

Author information


Correspondence to Eileen Kuehn.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Wendenius, C., Kuehn, E., Streit, A. (2023). Training Parameterized Quantum Circuits with Triplet Loss. In: Amini, MR., Canu, S., Fischer, A., Guns, T., Kralj Novak, P., Tsoumakas, G. (eds) Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2022. Lecture Notes in Computer Science, vol 13717. Springer, Cham. https://doi.org/10.1007/978-3-031-26419-1_31


  • DOI: https://doi.org/10.1007/978-3-031-26419-1_31

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-26418-4

  • Online ISBN: 978-3-031-26419-1

  • eBook Packages: Computer Science, Computer Science (R0)
