Combining Distant and Direct Supervision for Neural Relation Extraction

Iz Beltagy, Kyle Lo, Waleed Ammar
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2019)
In relation extraction with distant supervision, noisy labels make it difficult to train quality models. Previous neural models addressed this problem using an attention mechanism that attends to sentences that are likely to express the relations. We improve such models by combining the distant supervision data with additional directly supervised data, which we use as supervision for the attention weights. We find that joint training on both types of supervision leads to a better model because it improves the model's ability to identify noisy sentences. In addition, we find that sigmoidal attention weights with max pooling achieve better performance than the commonly used weighted-average attention in this setup. Our proposed method achieves a new state-of-the-art result on the widely used FB-NYT dataset.
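As a rough illustration of the two aggregation schemes the abstract contrasts, the sketch below compares the common softmax weighted-average attention over a bag of sentence embeddings with sigmoidal (per-sentence, independent) attention followed by max pooling. This is a minimal NumPy sketch with toy data, not the paper's implementation; the function names and the embedding/score values are assumptions for illustration.

```python
import numpy as np

def softmax_weighted_average(sent_embs, scores):
    """Common bag aggregation: softmax over attention scores,
    then a weighted average of the sentence embeddings."""
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ sent_embs               # (dim,)

def sigmoid_max_pooling(sent_embs, scores):
    """Sigmoidal attention: each sentence gets an independent gate
    in (0, 1), so noisy sentences can be suppressed without forcing
    the weights to sum to 1; then max-pool over the gated embeddings."""
    gates = 1.0 / (1.0 + np.exp(-scores))    # (n,)
    gated = sent_embs * gates[:, None]       # (n, dim)
    return gated.max(axis=0)                 # (dim,)

# Toy bag: 3 sentence embeddings of dimension 4 with attention scores.
rng = np.random.default_rng(0)
embs = rng.normal(size=(3, 4))
scores = np.array([2.0, -1.0, 0.5])

print(softmax_weighted_average(embs, scores).shape)  # (4,)
print(sigmoid_max_pooling(embs, scores).shape)       # (4,)
```

Unlike the softmax, the sigmoid gates are not coupled across sentences, so a bag where every sentence is noisy can have all gates near zero rather than being forced to distribute a full unit of attention mass.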
doi:10.18653/v1/n19-1184 dblp:conf/naacl/BeltagyLA19 fatcat:xfnyer4tojcmhgf4zp3tutnhdm