
Downstream Transformer Generation of Question-Answer Pairs with Preprocessing and Postprocessing Pipelines

EasyChair Preprint no. 8739

8 pages · Date: August 29, 2022

Abstract

We present a method for the downstream task of generating question-answer pairs (QAPs) from a given article with transformers. We first finetune pretrained transformers on QAP datasets. We then use a preprocessing pipeline to select appropriate answers from the article, and feed each answer together with its relevant context to the finetuned transformer to generate a candidate QAP. Finally, we use a postprocessing pipeline to filter out inadequate QAPs. In particular, using pretrained T5 models as transformers and the SQuAD dataset as the finetuning dataset, we obtain a finetuned T5 model that outperforms previous models on standard performance measures over the SQuAD dataset. We then show that, as assessed by human judges, our method based on this finetuned model generates a satisfactory number of high-quality QAPs on the Gaokao-EN dataset.
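The abstract outlines a three-stage pipeline: finetune a pretrained T5 for answer-aware question generation, select candidate answers from the article via preprocessing, generate one question per answer-context pair, and filter inadequate pairs via postprocessing. Below is a minimal sketch of the generation and filtering stages using the HuggingFace Transformers library. The model name ("t5-base" stands in for the authors' finetuned checkpoint), the answer-aware prompt format, and the filtering heuristics are illustrative assumptions, not the paper's exact implementation.

# Sketch of the generation and postprocessing stages described above.
# "t5-base" is a placeholder; the paper finetunes pretrained T5 on SQuAD.
from transformers import T5ForConditionalGeneration, T5Tokenizer

MODEL_NAME = "t5-base"
tokenizer = T5Tokenizer.from_pretrained(MODEL_NAME)
model = T5ForConditionalGeneration.from_pretrained(MODEL_NAME)

def generate_question(answer: str, context: str, max_length: int = 64) -> str:
    """Feed a preselected answer and its context to the finetuned model."""
    # The prompt format is an assumption; SQuAD-style question-generation
    # models commonly mark the target answer alongside the context.
    prompt = f"answer: {answer} context: {context}"
    inputs = tokenizer(prompt, return_tensors="pt",
                       truncation=True, max_length=512)
    output_ids = model.generate(**inputs, max_length=max_length, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

def keep_qap(question: str, answer: str) -> bool:
    """Postprocessing filter that discards clearly inadequate QAPs.
    These heuristics (length check, answer leakage) are placeholders
    for the paper's postprocessing pipeline."""
    if len(question.split()) < 4:
        return False
    if answer.lower() in question.lower():
        # A question that reveals its own answer is inadequate.
        return False
    return True

context = "T5 is a text-to-text transformer released by Google in 2019."
answer = "2019"
question = generate_question(answer, context)
if keep_qap(question, answer):
    print(question, "->", answer)

The preprocessing stage (answer selection from the article) is omitted here, since the abstract does not specify its selection criteria.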

Keyphrases: information extraction, natural language generation, natural language processing, neural networks, question generation

BibTeX entry
BibTeX does not have a dedicated entry type for preprints; the following workaround produces a correct reference:
@Booklet{EasyChair:8739,
  author = {Cheng Zhang and Hao Zhang and Yicheng Sun and Jie Wang},
  title = {Downstream Transformer Generation of Question-Answer Pairs with Preprocessing and Postprocessing Pipelines},
  howpublished = {EasyChair Preprint no. 8739},
  year = {2022}}