File type: application/pdf
Attention as Relation: Learning Supervised Multi-head Self-Attention for Relation Extraction
2020
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Joint entity and relation extraction is critical for many natural language processing (NLP) tasks and has attracted increasing research interest. However, it still faces the challenges of identifying overlapping relation triplets together with complete entity boundaries and of detecting multi-type relations. In this paper, we propose an attention-based joint model, which mainly contains an entity extraction module and a relation detection module, to address these challenges. The key […]
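The preserved abstract breaks off before describing the mechanism, but the title names it: supervised multi-head self-attention, with attention weights read as relation scores. Below is a minimal PyTorch sketch of that idea under stated assumptions; it is not the authors' released code. The class name `RelationAttention`, the per-relation projections, and the sigmoid scoring are illustrative guesses at one common way to realize "one attention head per relation type."

```python
import torch
import torch.nn as nn

class RelationAttention(nn.Module):
    """Hypothetical sketch: one attention head per relation type, so the
    attention score between tokens i and j under head r is read as the
    likelihood that relation r links token i (subject) to token j (object)."""

    def __init__(self, hidden_dim: int, num_relations: int):
        super().__init__()
        self.hidden_dim = hidden_dim
        self.num_relations = num_relations
        # Separate query/key projections for every relation type.
        self.query = nn.Linear(hidden_dim, hidden_dim * num_relations)
        self.key = nn.Linear(hidden_dim, hidden_dim * num_relations)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, hidden_dim) token encodings from any encoder
        # (e.g. a BiLSTM or BERT); the encoder choice is an assumption here.
        b, n, d = h.shape
        q = self.query(h).view(b, n, self.num_relations, d).transpose(1, 2)
        k = self.key(h).view(b, n, self.num_relations, d).transpose(1, 2)
        # scores: (batch, num_relations, seq_len, seq_len);
        # entry [r, i, j] scores relation r between tokens i and j.
        scores = torch.matmul(q, k.transpose(-2, -1)) / d ** 0.5
        # Sigmoid instead of softmax so several relations (and several
        # object tokens) can fire for the same subject token, which is
        # what the overlapping-triplet challenge requires.
        return torch.sigmoid(scores)

# Supervision (assumed, per "supervised ... self-attention" in the title):
# binary cross-entropy between these scores and a gold 0/1 tensor marking
# which (relation, subject-token, object-token) triples hold, e.g.
# loss = nn.BCELoss()(RelationAttention(256, 10)(h), gold)
```

The entity extraction module mentioned in the abstract would typically be a separate sequence tagger over the same encodings; pairing its predicted spans with the head-wise scores above yields typed triplets, though the exact pairing scheme is not stated in the preserved abstract.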
doi:10.24963/ijcai.2020/520
dblp:conf/ijcai/LinLLXLZ20
fatcat:4lyvgwpz6zeyhafipfpkuizlhe