LSTM Shift-Reduce CCG Parsing

Wenduan Xu
Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing (EMNLP 2016)
We describe a neural shift-reduce parsing model for CCG, factored into four unidirectional LSTMs and one bidirectional LSTM. This factorization allows the complete parsing history to be linearized, and results in a highly accurate greedy parser that outperforms all previous beam-search shift-reduce parsers for CCG. By further deriving a globally optimized model with a task-based loss, we improve over the state of the art by up to 2.67% labeled F1.
doi:10.18653/v1/d16-1181 dblp:conf/emnlp/Xu16
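
To make the factored architecture concrete, here is a minimal, hypothetical PyTorch sketch, not the paper's implementation. The abstract does not enumerate which parts of the parsing history the four unidirectional LSTMs cover, so the split into stack, buffer, action-history, and category-history encoders is an illustrative assumption, as are all class and parameter names (`FactoredStateEncoder`, the dimensions, etc.); the one fixed point taken from the abstract is the shape of the factorization: four unidirectional LSTMs plus one bidirectional LSTM over the input, feeding a greedy action scorer.

```python
# Hypothetical sketch of a factored shift-reduce state encoder.
# Component choices (stack/buffer/action/category histories) are
# assumptions for illustration; only the 4-unidirectional +
# 1-bidirectional LSTM factorization comes from the abstract.
import torch
import torch.nn as nn


class FactoredStateEncoder(nn.Module):
    def __init__(self, vocab_size, n_actions, n_categories,
                 emb_dim=64, hidden_dim=128):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        self.action_emb = nn.Embedding(n_actions, emb_dim)
        self.cat_emb = nn.Embedding(n_categories, emb_dim)
        # One bidirectional LSTM contextualizes the input sentence.
        self.sent_lstm = nn.LSTM(emb_dim, hidden_dim,
                                 bidirectional=True, batch_first=True)
        # Four unidirectional LSTMs, one per (assumed) history component.
        self.stack_lstm = nn.LSTM(2 * hidden_dim, hidden_dim, batch_first=True)
        self.buffer_lstm = nn.LSTM(2 * hidden_dim, hidden_dim, batch_first=True)
        self.action_lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.cat_lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        # Score the next shift-reduce action from the concatenated summaries.
        self.scorer = nn.Linear(4 * hidden_dim, n_actions)

    def forward(self, words, stack_idx, buffer_idx, actions, categories):
        # words: (B, T) token ids; stack_idx/buffer_idx: (B, S)/(B, Q)
        # positions into the sentence (assumed padded to equal length);
        # actions/categories: (B, A)/(B, C) history ids.
        ctx, _ = self.sent_lstm(self.word_emb(words))  # (B, T, 2H)

        def summarize(lstm, seq):
            # Keep the final hidden state as the component summary.
            _, (h, _) = lstm(seq)
            return h[-1]                               # (B, H)

        def select(idx):
            # Pick the contextual vectors at the given sentence positions.
            return ctx.gather(
                1, idx.unsqueeze(-1).expand(-1, -1, ctx.size(-1)))

        stack_h = summarize(self.stack_lstm, select(stack_idx))
        buffer_h = summarize(self.buffer_lstm, select(buffer_idx))
        action_h = summarize(self.action_lstm, self.action_emb(actions))
        cat_h = summarize(self.cat_lstm, self.cat_emb(categories))
        state = torch.cat([stack_h, buffer_h, action_h, cat_h], dim=-1)
        return self.scorer(state)                      # (B, n_actions)
```

Under these assumptions, greedy decoding as described in the abstract would simply take the argmax over the returned action scores after each transition, re-encoding the updated history sequences before the next decision.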