Machine Comprehension by Text-to-Text Neural Question Generation

Xingdi Yuan, Tong Wang, Caglar Gulcehre, Alessandro Sordoni, Philip Bachman, Saizheng Zhang, Sandeep Subramanian, Adam Trischler
Proceedings of the 2nd Workshop on Representation Learning for NLP, 2017
We propose a recurrent neural model that generates natural-language questions from documents, conditioned on answers. We show how to train the model using a combination of supervised and reinforcement learning. After teacher forcing for standard maximum-likelihood training, we fine-tune the model using policy gradient techniques to maximize several rewards that measure question quality. Most notably, one of these rewards is the performance of a question-answering system. We motivate question generation as a means to improve the performance of question-answering systems. Our model is trained and evaluated on the recent question-answering dataset SQuAD.
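To make the two-stage training scheme in the abstract concrete, here is a minimal sketch of one policy-gradient (REINFORCE) fine-tuning step that could follow maximum-likelihood pretraining. This is not the authors' implementation: `generator.sample` and `qa_reward` are hypothetical placeholders standing in for the question generator's sampling routine and a reward such as a QA system's score on the sampled question.

```python
# Hypothetical sketch, assuming a seq2seq `generator` pretrained with
# teacher forcing (maximum likelihood) and a black-box `qa_reward`.
import torch

def reinforce_step(generator, qa_reward, documents, answers, optimizer):
    """One REINFORCE update: sample questions, score them with a reward
    (e.g., a QA system's answer accuracy), and ascend the
    advantage-weighted log-likelihood."""
    # Assumed API: returns sampled questions and a (batch,) tensor holding
    # the summed log-probability of each sampled question.
    questions, log_probs = generator.sample(documents, answers)
    with torch.no_grad():
        rewards = qa_reward(documents, answers, questions)  # shape: (batch,)
        baseline = rewards.mean()  # simple baseline for variance reduction
    # Policy-gradient loss: negative advantage-weighted log-likelihood.
    loss = -((rewards - baseline) * log_probs).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item(), rewards.mean().item()
```

The mean-reward baseline is one common variance-reduction choice; the paper's actual reward shaping and optimization details are described in the full text.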
doi:10.18653/v1/w17-2603 dblp:conf/rep4nlp/YuanWGSBZST17 fatcat:zwajkq22wrdxrgw2vw7xv36dwe