GADESQL: Graph Attention Diffusion Enhanced Text-To-SQL with Single and Multi-hop Relations

  • Conference paper

Web Information Systems Engineering – WISE 2023

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14306)

Abstract

Text-To-SQL is crucial for enabling users without technical expertise to extract important information from databases effectively. Graph-based encoders have been employed successfully in this field. However, existing methods often adopt a node-centric approach that focuses on single-hop edge relations, which gives rise to two main issues: 1) they fail to differentiate between single-hop and multi-hop relations among nodes; and 2) they ignore the valuable multi-hop reasoning information between nodes. To tackle these challenges, we propose a Graph Attention Diffusion Enhanced Text-To-SQL (GADESQL) model that enables multi-hop reasoning among nodes. With graph attention diffusion (GAD), information propagates efficiently through multi-hop paths, integrating single-hop and multi-hop relations during the graph iteration process. Furthermore, we employ semantic dependency parsing for natural language analysis, constructing a semantic analysis tree for each question to strengthen the connection between question tokens and schema structures. Experiments on the cross-domain Spider dataset demonstrate that our model possesses strong generalization capability and achieves performance improvements over existing works.
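To make the diffusion mechanism concrete, here is a minimal PyTorch sketch of graph attention diffusion over a question-schema graph: single-hop attention is computed only on existing edges and then diffused over up to K hops with geometrically decaying weights, so one propagation step combines single-hop and multi-hop relations. The function name and the hyperparameters K and alpha are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of graph attention diffusion (GAD) for multi-hop message
# passing over a question-schema graph, assuming PyTorch. All names and
# hyperparameters are illustrative, not the authors' code.
import torch


def attention_diffusion(h, adj, w_q, w_k, K=3, alpha=0.15):
    """Diffuse single-hop attention over up to K hops.

    h        : [n, d] node features (question tokens and schema items)
    adj      : [n, n] 0/1 adjacency matrix of single-hop relations
    w_q, w_k : [d, d] query/key projection matrices
    """
    n, d = h.shape
    adj = adj.clone()
    adj.fill_diagonal_(1)                        # self-loops keep softmax finite

    # Single-hop attention, restricted to existing edges.
    q, k = h @ w_q, h @ w_k
    scores = (q @ k.T) / d ** 0.5
    scores = scores.masked_fill(adj == 0, float("-inf"))
    att = torch.softmax(scores, dim=-1)          # [n, n], row-stochastic

    # Geometric diffusion: sum_k alpha * (1 - alpha)^k * att^k mixes the
    # identity (k = 0), single-hop (k = 1) and multi-hop (k > 1) relations
    # into a single propagation operator.
    diffused = alpha * torch.eye(n)
    power = torch.eye(n)
    for hop in range(1, K + 1):
        power = power @ att
        diffused = diffused + alpha * (1 - alpha) ** hop * power

    return diffused @ h                          # multi-hop aware node features
```

In contrast, a purely node-centric encoder would stop at `att @ h`, i.e. single-hop aggregation only.

Similarly, the semantic analysis tree for a question can be produced by an off-the-shelf parser. The snippet below uses Stanza's dependency parser only as a stand-in to illustrate how question tokens could be linked through a parse tree before being connected to schema items; the example sentence and the choice of toolkit are assumptions, not the paper's exact pipeline.

```python
# Illustrative only: build a dependency tree for a question with Stanza
# (requires stanza.download("en") once beforehand).
import stanza

nlp = stanza.Pipeline(lang="en", processors="tokenize,pos,lemma,depparse")
doc = nlp("Show the names of singers whose age is above 30")

for word in doc.sentences[0].words:
    # word.head is the 1-based index of the governing token, 0 for the root.
    head = doc.sentences[0].words[word.head - 1].text if word.head > 0 else "ROOT"
    print(f"{word.text} -> {head} ({word.deprel})")
```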

Acknowledgment

This work is supported by the National Science Fund of China under grant No. 62072075 and the Key Research and Development Projects of the Sichuan Provincial Science and Technology Plan under grant No. 2023YFS0420.

Author information

Correspondence to Rui Xi or Xiaowen Nie.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Cao, Q., Xi, R., Wu, J., Nie, X., Liu, Y., Hou, M. (2023). GADESQL: Graph Attention Diffusion Enhanced Text-To-SQL with Single and Multi-hop Relations. In: Zhang, F., Wang, H., Barhamgi, M., Chen, L., Zhou, R. (eds) Web Information Systems Engineering – WISE 2023. WISE 2023. Lecture Notes in Computer Science, vol 14306. Springer, Singapore. https://doi.org/10.1007/978-981-99-7254-8_53

  • DOI: https://doi.org/10.1007/978-981-99-7254-8_53

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-7253-1

  • Online ISBN: 978-981-99-7254-8

  • eBook Packages: Computer Science, Computer Science (R0)
