Fusing external knowledge resources for natural language understanding techniques: A survey

Yuqi Wang, Wei Wang*, Qi Chen, Kaizhu Huang, Anh Nguyen, Suparna De, Amir Hussain

*Corresponding author for this work

Research output: Contribution to journal › Short survey › peer-review

10 Citations (Scopus)

Abstract

Knowledge resources, e.g. knowledge graphs, which formally represent essential semantics and information for logical inference and reasoning, can compensate for the knowledge-unaware nature of many natural language processing techniques based on deep neural networks. This paper provides a focused review of the emerging but intriguing topic of fusing high-quality external knowledge resources to improve the performance of natural language processing tasks. Existing methods and techniques are summarised in three main categories, depending on when, how and where external knowledge is fused into the underlying learning models: (1) static word embeddings, (2) sentence-level deep learning models, and (3) contextualised language representation models. We focus on solutions that mitigate two key issues: knowledge inclusion and the inconsistency between language and knowledge. Details on the design of each representative method, as well as their strengths and limitations, are discussed. We also point out potential future directions in view of the latest trends in natural language processing research.
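To give a concrete flavour of category (1), a well-known way of fusing an external lexical resource into static word embeddings is retrofitting (Faruqui et al., 2015), which iteratively pulls each pre-trained word vector towards the vectors of its neighbours in a knowledge resource such as WordNet. The sketch below is illustrative only and is not taken from the surveyed paper; the lexicon format, hyper-parameters and toy vectors are assumptions.

    # Minimal retrofitting sketch (illustrative, not the paper's method).
    import numpy as np

    def retrofit(embeddings, lexicon, alpha=1.0, beta=1.0, iterations=10):
        # embeddings: word -> pre-trained vector (kept fixed as the anchor)
        # lexicon:    word -> list of related words from a knowledge resource
        new_vecs = {w: v.copy() for w, v in embeddings.items()}
        for _ in range(iterations):
            for word, neighbours in lexicon.items():
                neighbours = [n for n in neighbours if n in new_vecs]
                if word not in new_vecs or not neighbours:
                    continue
                # Weighted average of the original vector and its neighbours.
                numerator = alpha * embeddings[word] + beta * sum(new_vecs[n] for n in neighbours)
                new_vecs[word] = numerator / (alpha + beta * len(neighbours))
        return new_vecs

    # Toy usage with made-up 3-d vectors and a tiny synonym lexicon.
    emb = {"car": np.array([1.0, 0.0, 0.0]),
           "automobile": np.array([0.0, 1.0, 0.0]),
           "banana": np.array([0.0, 0.0, 1.0])}
    lex = {"car": ["automobile"], "automobile": ["car"]}
    print(retrofit(emb, lex)["car"])

After retrofitting, "car" and "automobile" move closer together while "banana" is untouched, which illustrates the general idea of injecting relational knowledge into otherwise knowledge-unaware embeddings.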

Original language: English
Pages (from-to): 190-204
Number of pages: 15
Journal: Information Fusion
Volume: 92
DOIs
Publication status: Published - Apr 2023

Keywords

  • Deep learning
  • Knowledge fusion
  • Knowledge graph
  • Natural language understanding
  • Representation learning
