Learning Neural Templates for Text Generation

Sam Wiseman, Stuart Shieber, Alexander Rush
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing (EMNLP 2018)
While neural, encoder-decoder models have had significant empirical success in text generation, there remain several unaddressed problems with this style of generation. Encoder-decoder models are largely (a) uninterpretable, and (b) difficult to control in terms of their phrasing or content. This work proposes a neural generation system using a hidden semi-Markov model (HSMM) decoder, which learns latent, discrete templates jointly with learning to generate. We show that this model learns useful templates, and that these templates make generation both more interpretable and controllable. Furthermore, we show that this approach scales to real data sets and achieves strong performance nearing that of encoder-decoder text generation models.

Example source entity: Cotto — type[coffee shop], rating[3 out of 5], food[English], area[city centre], price[moderate], near[The Portland Arms]
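The core idea of an HSMM decoder is that each latent state emits a whole multi-word segment rather than a single token, so decoding a sentence yields a sequence of (state, phrase) pairs — a template. A minimal sketch of this segmental Viterbi decoding (not the paper's implementation; the `emit_logp`, `trans_logp`, and `init_logp` scoring functions and the toy state names are illustrative assumptions):

```python
import math

def hsmm_viterbi(tokens, states, max_len, emit_logp, trans_logp, init_logp):
    """Best-scoring segmentation of `tokens` into labeled segments.

    emit_logp(state, segment) -> log-score of a state emitting a token tuple
    trans_logp(prev, state)   -> log-score of a state transition
    init_logp(state)          -> log-score of the initial state
    """
    n = len(tokens)
    # best[i][z]: best score covering tokens[:i] with the last segment in state z
    best = [{z: -math.inf for z in states} for _ in range(n + 1)]
    back = [{z: None for z in states} for _ in range(n + 1)]
    for i in range(1, n + 1):
        for z in states:
            for length in range(1, min(max_len, i) + 1):
                j = i - length
                seg = tuple(tokens[j:i])
                e = emit_logp(z, seg)
                if j == 0:
                    s = init_logp(z) + e
                    if s > best[i][z]:
                        best[i][z], back[i][z] = s, (0, None)
                else:
                    for zp in states:
                        s = best[j][zp] + trans_logp(zp, z) + e
                        if s > best[i][z]:
                            best[i][z], back[i][z] = s, (j, zp)
    # Backtrack to recover the template: a list of (state, segment) pairs.
    z = max(states, key=lambda st: best[n][st])
    i, template = n, []
    while i > 0:
        j, zp = back[i][z]
        template.append((z, tuple(tokens[j:i])))
        i, z = j, zp
    return list(reversed(template))
```

With toy scores favoring a name segment followed by a description segment, decoding `["Cotto", "is", "a", "coffee", "shop"]` would recover a two-segment template such as `[("NAME", ("Cotto",)), ("DESC", ("is", "a", "coffee", "shop"))]`; in the paper the segment scores come from learned neural parameterizations and training is done jointly with generation.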
doi:10.18653/v1/d18-1356 dblp:conf/emnlp/WisemanSR18 fatcat:gdlefutuc5go5mysbhluxxkdbi