QARR-FSQA: Question-Answer Replacement and Removal Pretraining Framework for Few-Shot Question Answering

Siao Wah Tan, Chin Poo Lee*, Kian Ming Lim*, Connie Tee, Ali Alqahtani

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

In Natural Language Processing, creating training data for question answering (QA) systems typically requires significant effort and expertise. This challenge is amplified in few-shot scenarios, where only a limited number of training samples are available. This paper proposes a novel pretraining framework to enhance few-shot question answering (FSQA) capabilities. The framework begins with the selection of the Discrete Reasoning Over the Content of Paragraphs (DROP) dataset, designed for English reading comprehension tasks involving various reasoning types. During data preprocessing, question-answer pairs are converted into a predefined template: the input sequence concatenates the question, a mask token with a prefix, and the context, while the target sequence comprises the question and the answer. The Question-Answer Replacement and Removal (QARR) technique then augments the dataset by integrating the answer into the question and selectively removing words, and several templates for question-answer pairs are introduced. BART, T5, and LED are further pretrained on the augmented dataset with their respective architectures and optimization objectives and are used to evaluate the framework's performance. The study also investigates how different templates affect model performance on few-shot QA tasks. Evaluated on three datasets in few-shot scenarios, the QARR-T5 method outperforms state-of-the-art FSQA techniques, achieving the highest F1 scores on the SQuAD dataset: 81.7% in the 16-shot and 32-shot settings, 82.7% in the 64-shot setting, and 84.5% in the 128-shot setting. These results demonstrate the framework's effectiveness in improving models' generalization and performance on new datasets with limited samples, advancing few-shot QA.
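The abstract does not give implementation details, so the following Python sketch only illustrates the two steps it describes: building the input/target template (question, prefixed mask token, and context as input; question and answer as target) and the QARR augmentation (splicing the answer into the question, then selectively removing words). The mask token, answer prefix, removal probability, and function names are illustrative assumptions, not the paper's actual settings.

```python
# Hypothetical sketch of the preprocessing and QARR augmentation described in
# the abstract. Template wording, the mask-token prefix, and the removal
# probability are assumptions for illustration only.
from __future__ import annotations

import random

MASK_TOKEN = "<mask>"            # assumed placeholder token
ANSWER_PREFIX = "The answer is"  # assumed prefix preceding the mask token


def build_template(question: str, context: str, answer: str) -> tuple[str, str]:
    """Build the (input, target) pair for one question-answer example.

    Input  : question + prefixed mask token + context
    Target : question + answer
    """
    source = f"{question} {ANSWER_PREFIX} {MASK_TOKEN}. {context}"
    target = f"{question} {ANSWER_PREFIX} {answer}."
    return source, target


def qarr_augment(question: str, answer: str, removal_prob: float = 0.15,
                 seed: int | None = None) -> str:
    """Question-Answer Replacement and Removal (illustrative version).

    Replacement: splice the answer into the question in place of its
    wh-word(s). Removal: randomly drop a fraction of the remaining words,
    never the answer itself.
    """
    rng = random.Random(seed)
    wh_words = {"what", "who", "when", "where", "which", "how", "why"}
    tokens = question.rstrip("?").split()

    # Replacement: substitute wh-words with the answer span.
    replaced = [answer if tok.lower() in wh_words else tok for tok in tokens]

    # Removal: selectively drop words, keeping the answer.
    kept = [tok for tok in replaced
            if tok == answer or rng.random() > removal_prob]
    return " ".join(kept)


if __name__ == "__main__":
    q = "Who wrote the novel?"
    c = "The novel was written by Jane Austen in 1813."
    a = "Jane Austen"
    src, tgt = build_template(q, c, a)
    print(src)                       # question + prefixed mask + context
    print(tgt)                       # question + answer
    print(qarr_augment(q, a, seed=0))
```

In practice the mask token would be chosen to match the pretrained model's vocabulary, for example <mask> for BART or a sentinel token such as <extra_id_0> for T5.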

Original language: English
Pages (from-to): 159280-159295
Number of pages: 16
Journal: IEEE Access
Volume: 12
DOIs
Publication status: Published - 2024

Keywords

  • Few-shot question answering
  • Generative question answering models
  • Natural language processing
  • Pretraining framework

