
Transfer Learning for Larger, Broader, and Deeper Neural-Network Quantum States

Conference paper

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12924)

Abstract

Neural-network quantum states are a family of unsupervised neural-network models for simulating quantum many-body systems. We investigate the efficiency and effectiveness of neural-network quantum states based on deep restricted Boltzmann machines of different sizes, breadths, and depths. We propose and evaluate several transfer learning protocols that improve the scalability, effectiveness, and efficiency of neural-network quantum states across different numbers of visible nodes, hidden nodes per layer, and hidden layers. The results of a comparative empirical performance evaluation confirm the advantages of deep neural-network quantum states and of the proposed transfer learning protocols.
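To make the setting concrete, below is a minimal numpy sketch of a restricted-Boltzmann-machine quantum state in the standard Carleo-Troyer form, together with one plausible transfer protocol that grows a trained network to a larger lattice and a broader hidden layer. The block-tiling and near-zero-padding steps are illustrative assumptions about how such a protocol could work, not necessarily the exact protocols evaluated in the paper, and the function names (`rbm_amplitude`, `grow_rbm`) are hypothetical.

```python
import numpy as np

def rbm_amplitude(s, a, b, W):
    """Unnormalised RBM wavefunction amplitude for spins s in {-1, +1}^N:
    psi(s) = exp(a . s) * prod_j 2 cosh(b_j + (W^T s)_j)."""
    theta = b + W.T @ s                     # hidden-unit pre-activations
    return np.exp(a @ s) * np.prod(2.0 * np.cosh(theta))

def grow_rbm(a, b, W, factor=2, extra_hidden=0, noise=1e-2, rng=None):
    """Transfer trained parameters (a, b, W) to a larger, broader RBM.

    Visible growth: tile the trained parameters block-diagonally so a
    `factor`-times-larger (translation-invariant) lattice reuses the
    learned local correlations.  Hidden growth: append `extra_hidden`
    near-zero hidden units, which broadens the network while initially
    leaving the represented state almost unchanged."""
    rng = rng or np.random.default_rng()
    N, M = W.shape
    a_new = np.tile(a, factor)              # repeat visible biases
    b_new = np.tile(b, factor)              # repeat hidden biases
    W_new = np.zeros((factor * N, factor * M), dtype=W.dtype)
    for k in range(factor):                 # one weight block per lattice copy
        W_new[k * N:(k + 1) * N, k * M:(k + 1) * M] = W
    if extra_hidden:                        # broaden with near-zero units
        b_new = np.concatenate(
            [b_new, noise * rng.standard_normal(extra_hidden)])
        W_new = np.concatenate(
            [W_new, noise * rng.standard_normal((factor * N, extra_hidden))],
            axis=1)
    return a_new, b_new, W_new

# Example: transfer a 4-spin RBM with 8 hidden units to 8 spins, 20 hidden units.
rng = np.random.default_rng(0)
a = 0.1 * rng.standard_normal(4)
b = 0.1 * rng.standard_normal(8)
W = 0.1 * rng.standard_normal((4, 8))
a2, b2, W2 = grow_rbm(a, b, W, factor=2, extra_hidden=4, rng=rng)
s = rng.choice([-1.0, 1.0], size=8)
print(rbm_amplitude(s, a2, b2, W2))         # amplitude under the enlarged state
```

In practice the enlarged network would then be re-optimised (for example by variational Monte Carlo) from this warm start rather than from a random initialisation, which is the source of the efficiency gains the abstract describes.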



Acknowledgment

This research is supported by the project “Complex quantum systems with neural networks: relaxation and quantum computing” (No. MOE-T2EP50120-0019) funded by the Singapore Ministry of Education. The computational work for this article was partially performed on resources of the National Supercomputing Centre (NSCC), Singapore (https://www.nscc.sg).

Author information


Corresponding author

Correspondence to Remmy Zen.



Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Zen, R., Bressan, S. (2021). Transfer Learning for Larger, Broader, and Deeper Neural-Network Quantum States. In: Strauss, C., Kotsis, G., Tjoa, A.M., Khalil, I. (eds) Database and Expert Systems Applications. DEXA 2021. Lecture Notes in Computer Science, vol 12924. Springer, Cham. https://doi.org/10.1007/978-3-030-86475-0_21


  • DOI: https://doi.org/10.1007/978-3-030-86475-0_21

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-86474-3

  • Online ISBN: 978-3-030-86475-0

  • eBook Packages: Computer Science, Computer Science (R0)
