A BERT-based Distractor Generation Scheme with Multi-tasking and Negative Answer Training Strategies
2020
Findings of the Association for Computational Linguistics: EMNLP 2020
In this paper, we investigate the following two limitations of existing distractor generation (DG) methods. First, the quality of existing DG methods is still far from practical use; there remains substantial room for improvement. Second, existing DG designs mainly target single-distractor generation, whereas practical multiple-choice question (MCQ) preparation requires multiple distractors. Aiming at these goals, we present a new distractor generation scheme with multi-tasking and negative answer training strategies.
doi:10.18653/v1/2020.findings-emnlp.393
fatcat:bwdda2nlljdbbjhavmxjkd5i5i