A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2021; you can also visit the original URL. The file type is application/pdf.
Graph-to-Graph Transformer for Transition-based Dependency Parsing
2020
Findings of the Association for Computational Linguistics: EMNLP 2020
We propose the Graph2Graph Transformer architecture for conditioning on and predicting arbitrary graphs, and apply it to the challenging task of transition-based dependency parsing. After proposing two novel Transformer models of transition-based dependency parsing as strong baselines, we show that adding the proposed mechanisms for conditioning on and predicting graphs of Graph2Graph Transformer results in significant improvements, both with and without BERT pre-training. The novel baselines …
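For readers unfamiliar with the task, a transition-based dependency parser builds a parse tree incrementally by applying a sequence of discrete actions to a stack and a buffer. The sketch below illustrates one common transition system (arc-standard) purely as background; it is not the paper's model, which predicts such transitions with Transformer-based classifiers, and the class and method names here are illustrative assumptions.

```python
# Minimal arc-standard parser state: a stack, a buffer of token
# indices, and a growing set of (head, dependent) arcs. This is a
# generic textbook transition system, not the paper's exact one.

class ParserState:
    def __init__(self, words):
        self.stack = [0]                            # index 0 is ROOT
        self.buffer = list(range(1, len(words) + 1))
        self.arcs = []                              # (head, dependent) pairs

    def shift(self):
        # Move the next buffer token onto the stack.
        self.stack.append(self.buffer.pop(0))

    def left_arc(self):
        # Second-from-top of the stack becomes a dependent of the top.
        dep = self.stack.pop(-2)
        self.arcs.append((self.stack[-1], dep))

    def right_arc(self):
        # Top of the stack becomes a dependent of the second-from-top.
        dep = self.stack.pop()
        self.arcs.append((self.stack[-1], dep))

    def is_done(self):
        # Parsing ends with an empty buffer and only ROOT on the stack.
        return not self.buffer and self.stack == [0]

# Example: "she saw stars" parsed by a fixed transition sequence.
state = ParserState(["she", "saw", "stars"])
for action in ["shift", "shift", "left_arc", "shift", "right_arc", "right_arc"]:
    getattr(state, action)()
print(sorted(state.arcs))  # [(0, 2), (2, 1), (2, 3)]
```

A learned model chooses the next action from the current state; the paper's contribution is conditioning that choice on, and predicting, the partially built graph itself.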
doi:10.18653/v1/2020.findings-emnlp.294
fatcat:cjfvgchamnafndpwst4z5qhbhi