Recurrent models and lower bounds for projective syntactic decoding

Natalie Schluter
2019 Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2019)
The current state-of-the-art in neural graph-based parsing uses only approximate decoding at the training phase. In this paper we aim to understand this result better. We show how recurrent models can carry out projective maximum spanning tree decoding. This result holds for the current state-of-the-art models for both shift-reduce and graph-based parsers, projective or not. We also provide the first proofs of lower bounds for projective maximum spanning tree, DAG, and digraph decoding.
doi:10.18653/v1/n19-1022 dblp:conf/naacl/Schluter19 fatcat:evojft4tjje65ahc4f5djtqknq
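
For context on the abstract's central notion: projective maximum spanning tree decoding is standardly carried out with Eisner's O(n^3) dynamic program over complete and incomplete spans, and it is this kind of decoding that the paper shows recurrent models can carry out. Below is a minimal sketch of that standard algorithm, assuming a dense arc-score matrix such as a graph-based parser would produce; the function name and array layout are illustrative, not taken from the paper.

```python
import numpy as np

def eisner_decode(scores):
    """Projective maximum spanning tree via Eisner's O(n^3) dynamic
    program.  scores[h, m] is the score of arc h -> m; node 0 is an
    artificial root.  Returns heads, where heads[m] is the head of
    word m (heads[0] = -1)."""
    n = scores.shape[0]
    # complete[s, t, d] / incomp[s, t, d]: best score of a complete /
    # incomplete span s..t whose head sits at the t end (d = 0) or the
    # s end (d = 1).  Base cases (s == t) are 0.
    complete = np.zeros((n, n, 2))
    incomp = np.zeros((n, n, 2))
    comp_bp = np.full((n, n, 2), -1, dtype=int)    # split backpointers
    incomp_bp = np.full((n, n, 2), -1, dtype=int)

    for k in range(1, n):                          # span length
        for s in range(n - k):
            t = s + k
            # Incomplete spans: join two complete halves, add one arc.
            cand = [complete[s, r, 1] + complete[r + 1, t, 0]
                    for r in range(s, t)]
            r = s + int(np.argmax(cand))
            incomp[s, t, 0] = cand[r - s] + scores[t, s]   # arc t -> s
            incomp[s, t, 1] = cand[r - s] + scores[s, t]   # arc s -> t
            incomp_bp[s, t, 0] = incomp_bp[s, t, 1] = r
            # Complete span headed at t: complete left + incomplete right.
            cand = [complete[s, r, 0] + incomp[r, t, 0] for r in range(s, t)]
            comp_bp[s, t, 0] = s + int(np.argmax(cand))
            complete[s, t, 0] = max(cand)
            # Complete span headed at s: incomplete left + complete right.
            cand = [incomp[s, r, 1] + complete[r, t, 1]
                    for r in range(s + 1, t + 1)]
            comp_bp[s, t, 1] = s + 1 + int(np.argmax(cand))
            complete[s, t, 1] = max(cand)

    heads = [-1] * n
    def backtrack(s, t, d, is_complete):
        if s == t:
            return
        if is_complete:
            r = comp_bp[s, t, d]
            if d == 0:
                backtrack(s, r, 0, True); backtrack(r, t, 0, False)
            else:
                backtrack(s, r, 1, False); backtrack(r, t, 1, True)
        else:
            if d == 0:
                heads[s] = t                       # attach s under t
            else:
                heads[t] = s                       # attach t under s
            r = incomp_bp[s, t, d]
            backtrack(s, r, 1, True)
            backtrack(r + 1, t, 0, True)

    backtrack(0, n - 1, 1, True)
    return heads

# Toy usage: 1 root + 4 words with random arc scores.
rng = np.random.default_rng(0)
print(eisner_decode(rng.standard_normal((5, 5))))
```

The cubic span chart here, which touches every (s, r, t) split point regardless of how the arc scores were produced, is the kind of decoding cost that the paper's lower-bound results for projective maximum spanning tree decoding concern.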