

Purpose:
SPARQL is a highly expressive query language for knowledge graphs, yet formulating precise SPARQL queries can be challenging for non-expert users. A potential solution is to automatically translate natural language questions into SPARQL queries, a task known as SPARQL generation. This paper addresses the challenges of translating natural language questions into SPARQL queries over different knowledge graphs.
Methodology:
We propose COT-SPARQL, an approach for generating SPARQL queries from input questions. Our approach employs Chain-of-Thought prompting, which guides large language models through intermediate reasoning steps and facilitates generating precise SPARQL queries. Furthermore, our approach incorporates entities and relations extracted from the input question, along with a one-shot example, into the prompt to provide additional context during query generation.
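The prompt composition described above can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the instruction wording, the one-shot example, and the entity/relation annotations are hypothetical placeholders.

```python
# Hypothetical one-shot example pairing a question with entities, relations,
# intermediate reasoning steps, and the target SPARQL query.
ONE_SHOT_EXAMPLE = """Question: Who is the author of Dune?
Entities: dbr:Dune_(novel)
Relations: dbo:author
Reasoning: The question asks for the author of the novel Dune.
We select ?author linked to dbr:Dune_(novel) via dbo:author.
SPARQL: SELECT ?author WHERE { dbr:Dune_(novel) dbo:author ?author . }"""


def build_cot_prompt(question, entities, relations):
    """Compose a Chain-of-Thought prompt: a task instruction, a worked
    one-shot example, the linked entities/relations for the new question,
    and a cue for the model to reason before emitting the query."""
    return "\n\n".join([
        "Translate the question into a SPARQL query. "
        "Think through the reasoning steps before writing the query.",
        ONE_SHOT_EXAMPLE,
        f"Question: {question}",
        f"Entities: {', '.join(entities)}",
        f"Relations: {', '.join(relations)}",
        "Reasoning:",
    ])


prompt = build_cot_prompt(
    "Where was Albert Einstein born?",
    ["dbr:Albert_Einstein"],
    ["dbo:birthPlace"],
)
```

The resulting string would be sent to the large language model, whose completion contains the reasoning steps followed by the generated SPARQL query.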
Findings:
We conducted several experiments on benchmark datasets and showed that our approach outperforms state-of-the-art methods, improving the F1 score by 4.4% and 3.0% on the QALD-10 and QALD-9 datasets, respectively.
Value:
Our COT-SPARQL approach contributes to the semantic web community by simplifying access to knowledge graphs for non-expert users. In particular, COT-SPARQL enables non-expert end-users to query knowledge graphs in natural language: it converts users' natural language questions into SPARQL queries, which can then be executed via the knowledge graph's SPARQL endpoint.