Robust and Efficient Chinese Word Dependency Analysis with Linear Kernel Support Vector Machines
International Conference on Computational Linguistics
Data-driven learning based on shift-reduce parsing algorithms has emerged in dependency parsing and shown excellent performance on many treebanks. In this paper, we investigate extensions of those methods that considerably improve runtime and training-time efficiency via L2-SVMs. We also present several properties and constraints to enhance parser completeness at runtime. We further integrate root-level and bottom-level syntactic information by using sequential taggers. The experimental results show the positive effect of the root-level and bottom-level features, which improve our parser from 81.17% to 81.41% and from 81.16% to 81.57% labeled attachment score with the modified Yamada's and Nivre's methods, respectively, on the Chinese Treebank. In comparison to well-known parsers such as MaltParser (80.74%) and MSTParser (78.08%), our methods produce not only better accuracy but also drastically reduced testing times of 0.07 and 0.11, respectively.
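The abstract's shift-reduce approach can be illustrated with a minimal sketch of arc-standard transition-based parsing. In the paper, an L2-SVM classifier predicts the next transition from features of the parser state; here a simple gold-head oracle stands in for the classifier (the oracle and the toy sentence are illustrative assumptions, not the authors' implementation, and projective trees are assumed).

```python
def oracle(stack, heads, gold):
    """Choose the next transition from the current state.

    Hypothetical static oracle: it consults the gold heads directly,
    standing in for the SVM classifier used in the paper.
    """
    if len(stack) >= 2:
        s1, s0 = stack[-2], stack[-1]
        # LEFT-ARC: the second-top item's head is the top item.
        if gold[s1] == s0:
            return "LEFT"
        # RIGHT-ARC: the top item's head is the second-top item, and the
        # top item has already collected all of its own dependents.
        if gold[s0] == s1 and all(gold[w] != s0 or heads[w] is not None
                                  for w in range(len(gold))):
            return "RIGHT"
    return "SHIFT"

def parse(n, gold):
    """Arc-standard parse of an n-token sentence (root's gold head = -1)."""
    stack, buffer = [], list(range(n))
    heads = [None] * n          # heads[i] = predicted head of token i
    while buffer or len(stack) > 1:
        action = oracle(stack, heads, gold)
        if action == "SHIFT":
            stack.append(buffer.pop(0))
        elif action == "LEFT":
            dep = stack.pop(-2)  # attach second-top under top
            heads[dep] = stack[-1]
        else:                    # RIGHT
            dep = stack.pop()    # attach top under second-top
            heads[dep] = stack[-1]
    return heads

# Toy sentence: "He saw her", with "saw" (index 1) as the root.
print(parse(3, [1, -1, 1]))  # → [1, None, 1]
```

Each transition either shifts a token onto the stack or builds one dependency arc, so a sentence is parsed in linear time; the paper's contribution is making the per-transition classification fast via linear-kernel L2-SVMs.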