Optimizing Differentiable Relaxations of Coreference Evaluation Metrics

Phong Le, Ivan Titov
Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL 2017), 2017

Coreference evaluation metrics are hard to optimize directly as they are non-differentiable functions, not easily decomposable into elementary decisions. Consequently, most approaches optimize objectives only indirectly related to the end goal, resulting in suboptimal performance. Instead, we propose a differentiable relaxation that lends itself to gradient-based optimisation, thus bypassing the need for reinforcement learning or heuristic modification of cross-entropy. We show that by modifying the training objective of a competitive neural coreference system, we obtain a substantial gain in performance. This suggests that our approach can be regarded as a viable alternative to using reinforcement learning or more computationally expensive imitation learning.
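The core idea, making a hard cluster-level metric differentiable by replacing argmax antecedent decisions with their softmax probabilities, can be sketched on a toy pairwise-link F1. This is a minimal illustration of the general relaxation pattern, not the paper's actual B-cubed/LEA constructions; the names `soft_link_f1`, `scores`, and `gold_links` are hypothetical.

```python
# Illustrative sketch only: a differentiable surrogate of a hard
# pairwise-link coreference F1, obtained by keeping the full softmax
# distribution over antecedents instead of taking a hard argmax.
import torch
import torch.nn.functional as F

def soft_link_f1(scores: torch.Tensor, gold_links: torch.Tensor,
                 eps: float = 1e-8) -> torch.Tensor:
    """Relaxed pairwise-link F1 (hypothetical helper, for illustration).

    scores:     (n_mentions, n_mentions + 1) antecedent scores; column 0
                is a dummy "no antecedent" option, column j+1 refers to
                mention j. Non-preceding candidates should be masked to
                -inf by the caller.
    gold_links: (n_mentions, n_mentions + 1) 0/1 matrix marking gold
                antecedents, same layout as `scores`.
    """
    # Hard evaluation would take argmax over each row; the relaxation
    # uses the full antecedent distribution, so gradients flow through.
    probs = F.softmax(scores, dim=-1)

    # Expected number of correctly predicted links (soft true positives).
    soft_tp = (probs[:, 1:] * gold_links[:, 1:]).sum()
    # Expected number of links the system predicts.
    soft_pred = probs[:, 1:].sum()
    # Number of gold links.
    n_gold = gold_links[:, 1:].sum()

    precision = soft_tp / (soft_pred + eps)
    recall = soft_tp / (n_gold + eps)
    return 2 * precision * recall / (precision + recall + eps)

# Training would then maximize the relaxed metric directly, e.g.:
#   loss = -soft_link_f1(model_scores, gold_links)
#   loss.backward()
```

Because the surrogate is an expectation under the model's antecedent distribution, it approaches the hard metric as the softmax sharpens, which is what lets gradient-based training stand in for reinforcement learning here.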
doi:10.18653/v1/k17-1039 dblp:conf/conll/LeT17