
A periodicity aware transformer for crystal property prediction

Original Article · Neural Computing and Applications

Abstract

Crystals constitute a wide variety of important materials, from everyday life to cutting-edge fields. As physical theory demonstrates, the properties of a crystal are determined by its structure, which makes structure essential for understanding and designing materials. In recent years, deep learning-based methods have been proposed to predict crystal material properties and have achieved satisfactory performance. However, these methods have not adequately accounted for a key structural characteristic of crystals: periodicity. To address this issue, we propose a periodicity aware crystal transformer (PACT), which uses hierarchical self-attention mechanisms to enforce periodicity constraints on the crystal structure. Specifically, it applies unit-wise self-attention and crystal-wise self-attention to ensure that the surroundings of atoms or unit cells at periodic distances are identical. Extensive benchmark experiments demonstrate that our proposed model exhibits superior performance, achieving an average improvement of 7.07% over previous methods. Additionally, ablation studies show that both the unit-wise and crystal-wise self-attention in the hierarchical self-attention mechanism are effective in modeling periodicity.
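The abstract describes a two-level (hierarchical) self-attention scheme: unit-wise self-attention over the atoms inside a unit cell, followed by crystal-wise self-attention over unit-cell representations. As an illustration only, below is a minimal PyTorch sketch of what such a module could look like; the class name, the mean pooling between the two levels, and the tensor layout are assumptions made for this example, not the authors' published implementation.

```python
import torch
import torch.nn as nn


class HierarchicalSelfAttention(nn.Module):
    """Hypothetical two-level self-attention: atoms within each unit
    cell first, then unit cells across the whole crystal."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.unit_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.crystal_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, atoms: torch.Tensor) -> torch.Tensor:
        # atoms: (num_cells, atoms_per_cell, dim), i.e. the same unit
        # cell replicated at periodic distances.
        # Unit-wise self-attention: each atom attends only to atoms in
        # its own unit cell (cells are treated as the batch dimension).
        unit_out, _ = self.unit_attn(atoms, atoms, atoms)
        # Pool each cell's atoms into one unit-cell embedding
        # (mean pooling is an assumption for this sketch).
        cells = unit_out.mean(dim=1).unsqueeze(0)  # (1, num_cells, dim)
        # Crystal-wise self-attention: unit cells attend to one another,
        # so identical cells at periodic distances see identical context.
        crystal_out, _ = self.crystal_attn(cells, cells, cells)
        return crystal_out.squeeze(0)  # (num_cells, dim)


# Usage: 3 replicated unit cells of 4 atoms each, 64-dim embeddings.
x = torch.randn(3, 4, 64)
module = HierarchicalSelfAttention(dim=64)
print(module(x).shape)  # torch.Size([3, 64])
```

Because the input cells are periodic replicas, the unit-wise pass produces identical per-cell embeddings, matching the paper's stated constraint that surroundings at periodic distances be identical.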



Data availability

The datasets used in this study are available from the corresponding author upon reasonable request.


Funding

The authors did not receive support from any organization for the submitted work. The authors have no relevant financial or non-financial interests to disclose.

Author information

Authors and Affiliations

Authors

Contributions

KL was involved in conceptualization, methodology, validation, formal analysis, investigation, visualization, and writing—original draft. KY took part in methodology, experiments, visualization, and writing—review and editing. SG was responsible for visualization, experiments, and writing—review and editing.

Corresponding author

Correspondence to Ke Liu.

Ethics declarations

Conflict of interest

The authors declare no conflict of interest or competing interests.

Ethical approval

The data used in this study raise no ethical concerns.

Classification points

This submission falls under AI for science, specifically AI for materials science.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Liu, K., Yang, K. & Gao, S. A periodicity aware transformer for crystal property prediction. Neural Comput & Applic 36, 6827–6838 (2024). https://doi.org/10.1007/s00521-024-09432-4

