Structural Supervision Improves Learning of Non-Local Grammatical Dependencies

Ethan Wilcox, Peng Qian, Richard Futrell, Miguel Ballesteros, Roger Levy
2019 Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT)
State-of-the-art LSTM language models trained on large corpora learn sequential contingencies in impressive detail and have been shown to acquire a number of non-local grammatical dependencies with some success. Here we investigate whether supervision with hierarchical structure enhances learning of a range of grammatical dependencies, a question that has previously been addressed only for subject-verb agreement. Using controlled experimental methods from psycholinguistics, we compare the performance of word-based LSTM models versus two models that represent hierarchical structure and deploy it in left-to-right processing: Recurrent Neural Network Grammars (RNNGs) (Dyer et al., 2016) and an incrementalized version of the Parsing-as-Language-Modeling configuration from Charniak et al. (2016). Models are tested on a diverse range of configurations for two classes of non-local grammatical dependencies in English: Negative Polarity licensing and Filler-Gap Dependencies. Using the same training data across models, we find that structurally-supervised models outperform the LSTM, with the RNNG demonstrating the best results on both types of grammatical dependencies and even learning many of the Island Constraints on the filler-gap dependency. Structural supervision thus provides data efficiency advantages over purely string-based training of neural language models in acquiring human-like generalizations about non-local grammatical dependencies.
doi:10.18653/v1/n19-1334 dblp:conf/naacl/WilcoxQFBL19
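The controlled psycholinguistic evaluations referenced in the abstract are typically implemented by comparing per-word surprisals across minimal pairs, e.g. a Negative Polarity Item with and without its licensor. The following is a minimal sketch of that paradigm, using a pretrained GPT-2 from Hugging Face only as an illustrative stand-in for the paper's LSTM and RNNG models; the example sentences, helper names, and model choice are assumptions for illustration, not the paper's actual materials or code.

```python
# Sketch of the surprisal-based minimal-pair paradigm used in this line of work.
# GPT-2 is an illustrative stand-in; the paper evaluates LSTM and RNNG models.
import math

import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()


def token_surprisals(sentence: str):
    """Per-token surprisals in bits for every token after the first."""
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits
    # Position i predicts token i+1, so align logits[:-1] with ids[1:].
    log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
    picked = log_probs[torch.arange(ids.size(1) - 1), ids[0, 1:]]
    surprisals = (-picked / math.log(2)).tolist()
    tokens = tokenizer.convert_ids_to_tokens(ids[0])[1:]
    return list(zip(tokens, surprisals))


def surprisal_at(sentence: str, word: str) -> float:
    """Surprisal of the first occurrence of a critical word (e.g. the NPI)."""
    target = tokenizer.tokenize(word)[0]
    for token, s in token_surprisals(sentence):
        if token == target:
            return s
    raise ValueError(f"{word!r} not found in {sentence!r}")


# NPI minimal pair: "ever" requires a licensor such as "No".
licensed = "No senator that the reporters met has ever lied."
unlicensed = "The senator that the reporters met has ever lied."

# A model that has learned the dependency should assign higher surprisal
# to the NPI when no licensor is present.
print(surprisal_at(unlicensed, " ever") - surprisal_at(licensed, " ever"))
```

A positive difference indicates that the model treats the unlicensed NPI as less expected than the licensed one; the paper applies analogous contrasts, with structured (RNNG, incremental parsing-as-language-modeling) and sequential (LSTM) models, across a range of NPI and filler-gap configurations.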