TransKGQA: Enhanced Knowledge Graph Question Answering With Sentence Transformers

You Li Chong, Chin Poo Lee*, Shahrin Zen Muhd-Yassin, Kian Ming Lim, Ahmad Kamsani Samingan

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Knowledge Graph Question Answering (KGQA) plays a crucial role in extracting valuable insights from interconnected information. Existing methods, while commendable, face challenges such as contextual ambiguity and limited adaptability to diverse knowledge domains. This paper introduces TransKGQA, a novel approach addressing these challenges. Leveraging Sentence Transformers, TransKGQA enhances contextual understanding, making it adaptable to various knowledge domains. The model employs question-answer pair augmentation for robustness and introduces a threshold mechanism for reliable answer retrieval. TransKGQA overcomes limitations in existing works by offering a versatile solution for diverse question types. Experimental results, notably with the sentence-transformers/all-MiniLM-L12-v2 model, showcase remarkable performance with an F1 score of 78%. This work advances KGQA systems, contributing to knowledge graph construction, enhanced question answering, and automated Cypher query execution.
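
The abstract describes retrieving answers by semantic similarity with a cutoff threshold for reliability. A minimal sketch of that idea is shown below; it is an illustration, not the paper's implementation. The `embed` function here is a toy bag-of-words stand-in for a sentence-transformer encoder such as sentence-transformers/all-MiniLM-L12-v2, so the example stays self-contained; the `retrieve` helper and the 0.5 threshold value are assumptions for demonstration.

```python
import math

def embed(text, vocab):
    # Toy bag-of-words embedding; a real system would call a
    # sentence-transformer model's encode() here instead.
    tokens = text.lower().split()
    return [tokens.count(word) for word in vocab]

def cos_sim(a, b):
    # Cosine similarity between two vectors; 0.0 when either is empty.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question, candidates, vocab, threshold=0.5):
    # Score every candidate answer against the question and keep the best.
    q = embed(question, vocab)
    best, best_score = None, -1.0
    for cand in candidates:
        score = cos_sim(q, embed(cand, vocab))
        if score > best_score:
            best, best_score = cand, score
    # Threshold mechanism: refuse to answer when similarity is too low,
    # rather than returning an unreliable match.
    return best if best_score >= threshold else None
```

In a full pipeline, the candidates would come from knowledge-graph facts (e.g. results of a Cypher query against Neo4j) and the threshold would be tuned on held-out question-answer pairs.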

Original language: English
Pages (from-to): 74872-74887
Number of pages: 16
Journal: IEEE Access
Volume: 12
DOIs
Publication status: Published - 2024
Externally published: Yes

Keywords

  • knowledge graph
  • machine learning
  • natural language processing
  • Neo4j
  • question answering
  • sentence transformer
