Knowledge-Enriched Moral Understanding upon Continual Pre-training

Jing Qian, Yong Yue, Katie Atkinson, Gangmin Li

Research output: Contribution to conference › Paper › peer-review

Abstract

The aim of moral understanding is to comprehend the abstract concepts hidden in a story by seeing through its concrete events and vivid characters. Specifically, the story is highly summarized in one sentence that covers none of the characters in the original story, which requires the machine to behave more intelligently, with the abilities of moral perception and commonsense reasoning. The paradigm of “pre-training + fine-tuning” is generally accepted for applying neural language models. In this paper, we suggest adding an intermediate stage to build the flow of “pre-training + continual pre-training + fine-tuning”. Continual pre-training refers to further training on task-relevant or domain-specific corpora with the aim of bridging the data-distribution gap between pre-training and fine-tuning. Experiments are based on a new moral story dataset, STORAL-ZH, which consists of 4,209 Chinese story-moral pairs. We collect a moral corpus about Confucian theory to enrich the T5 model with moral knowledge. Furthermore, we leverage a Chinese commonsense knowledge graph to enhance the model with commonsense knowledge. Experimental results demonstrate the effectiveness of our method compared with several state-of-the-art models, including BERT-base, RoBERTa-base and T5-base.
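To make the intermediate stage concrete, the following is a minimal, purely illustrative sketch (not the paper's code): a toy character-bigram language model whose counts are updated first on a generic corpus (pre-training) and then on a small domain corpus (continual pre-training). The corpora and the model are invented for illustration; the point is only that additional training on domain-relevant text lowers perplexity on held-out domain text, i.e. it narrows the distribution gap before fine-tuning.

```python
# Toy illustration of "pre-training + continual pre-training":
# a character-bigram LM with add-one smoothing. All corpora here
# are hypothetical stand-ins for the real pre-training data.
from collections import Counter
import math

class BigramLM:
    def __init__(self):
        self.bigrams = Counter()
        self.unigrams = Counter()

    def train(self, text):
        # Each call continues from the current counts, mirroring how
        # continual pre-training resumes from an existing checkpoint.
        for a, b in zip(text, text[1:]):
            self.bigrams[(a, b)] += 1
            self.unigrams[a] += 1

    def perplexity(self, text, vocab_size=128):
        # Add-one smoothed perplexity; lower means a better fit.
        logp = 0.0
        for a, b in zip(text, text[1:]):
            p = (self.bigrams[(a, b)] + 1) / (self.unigrams[a] + vocab_size)
            logp += math.log(p)
        return math.exp(-logp / max(len(text) - 1, 1))

general = "the cat sat on the mat. the dog ran in the park."
domain = "virtue and wisdom guide moral conduct. wisdom guides virtue."
held_out = "wisdom and virtue guide conduct."

lm = BigramLM()
lm.train(general)                  # stage 1: generic pre-training
ppl_before = lm.perplexity(held_out)
lm.train(domain)                   # stage 2: continual pre-training on domain text
ppl_after = lm.perplexity(held_out)
print(ppl_before, ppl_after)       # domain exposure lowers held-out perplexity
```

Stage 3 (fine-tuning on the story-moral pairs) would follow the same pattern, further specializing the model on the supervised task data.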
Original language: English
DOIs
Publication status: Published - 20 Feb 2023
Event: 10th International Conference on Computer Networks & Communications - Vancouver, Canada
Duration: 22 Feb 2023 – 24 Feb 2023

Conference

Conference: 10th International Conference on Computer Networks & Communications
Abbreviated title: CCNET 2023
Country/Territory: Canada
City: Vancouver
Period: 22/02/23 – 24/02/23

Keywords

  • Moral Understanding, Continual Pre-training, Knowledge Graph, Commonsense
