Bootstrapping incremental dialogue systems from minimal data: the generalisation power of dialogue grammars

Arash Eshghi, Igor Shalyminov, Oliver Lemon
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing (EMNLP 2017)
We investigate an end-to-end method for automatically inducing task-based dialogue systems from small amounts of unannotated dialogue data. It combines an incremental semantic grammar, Dynamic Syntax and Type Theory with Records (DS-TTR), with Reinforcement Learning (RL), where language generation and dialogue management are a joint decision problem. The systems thus produced are incremental: dialogues are processed word-by-word, shown previously to be essential in supporting natural, spontaneous dialogue. We hypothesised that the rich linguistic knowledge within the grammar should enable a combinatorially large number of dialogue variations to be processed, even when trained on very few dialogues. Our experiments show that our model can process 74% of the Facebook AI bAbI dataset even when trained on only 0.13% of the data (5 dialogues). It can in addition process 65% of bAbI+, a corpus we created by systematically adding incremental dialogue phenomena such as restarts and self-corrections to bAbI. We compare our model with a state-of-the-art retrieval model, memn2n (Bordes et al., 2017). We find that, in terms of semantic accuracy, memn2n shows very poor robustness to the bAbI+ transformations, even when trained on the full bAbI dataset.
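To make the joint decision problem concrete, the Python sketch below shows a word-by-word processing loop in which a policy may act at every word boundary. All names here (DialogueState, Policy, process_dialogue) are hypothetical stand-ins, not the paper's DS-TTR/RL implementation; the real system updates a DS-TTR semantic record and learns the policy with RL.

# Minimal, illustrative sketch of word-by-word incremental processing with a
# single policy covering both generation and dialogue management. Hypothetical
# stand-ins only; not the authors' DS-TTR / RL code.
from dataclasses import dataclass, field

@dataclass
class DialogueState:
    """Semantic context built up word by word (stands in for a DS-TTR record)."""
    words: list = field(default_factory=list)

    def extend(self, word: str) -> "DialogueState":
        return DialogueState(self.words + [word])

class Policy:
    """Stand-in policy: at each word boundary it either picks a word to
    generate next or the special action WAIT (let the user continue)."""
    WAIT = "<wait>"

    def act(self, state: DialogueState) -> str:
        # A learned RL policy would score actions against the state; this
        # placeholder simply waits.
        return self.WAIT

def process_dialogue(user_words, policy: Policy) -> DialogueState:
    state = DialogueState()
    for word in user_words:           # dialogues are processed word-by-word
        state = state.extend(word)    # the grammar updates the semantic state
        action = policy.act(state)    # the agent may interject at any word
        if action != Policy.WAIT:
            state = state.extend(action)  # system words update the state too
    return state

print(process_dialogue("i would like some sushi".split(), Policy()).words)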
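The bAbI+ construction can likewise be sketched: the snippet below inserts a self-correction into an utterance by adding a distractor word followed by an editing term. The editing-term template, the distractor choice (drawn from the utterance itself), and the insertion probability are illustrative assumptions, not the exact procedure used to build the corpus.

# Illustrative sketch of a bAbI+-style transformation: with probability p,
# insert a distractor word plus an editing term before the word it "corrects".
import random

def add_self_correction(words, p=0.3, editing_term="no sorry", rng=random):
    """Hypothetical self-correction inserter; parameters are assumptions."""
    if rng.random() > p or len(words) < 2:
        return list(words)
    i = rng.randrange(len(words))
    # Simplification: pick the distractor from elsewhere in the utterance.
    distractor = rng.choice([w for j, w in enumerate(words) if j != i])
    # e.g. "a table for four people" -> "a table for table no sorry four people"
    return words[:i] + [distractor] + editing_term.split() + words[i:]

print(" ".join(add_self_correction("a table for four people".split(), p=1.0)))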
doi:10.18653/v1/d17-1236 dblp:conf/emnlp/EshghiSL17