Zero-Shot Information Extraction as a Unified Text-to-Triple Translation

Chenguang Wang, Xiao Liu, Zui Chen, Haoyun Hong, Jie Tang, Dawn Song
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021)
We cast a suite of information extraction tasks into a text-to-triple translation framework. Instead of solving each task with task-specific datasets and models, we formalize each task as a translation between task-specific input text and output triples. By taking the task-specific input, we enable a task-agnostic translation that leverages the latent knowledge a pre-trained language model has about the task. We further demonstrate that a simple pre-training task of predicting which relational information corresponds to which input text is an effective way to produce task-specific outputs. This enables the zero-shot transfer of our framework to downstream tasks. We study the zero-shot performance of this framework on open information extraction (OIE2016, NYT, WEB, PENN), relation classification (FewRel and TACRED), and factual probing (Google-RE and T-REx). The model transfers non-trivially to most tasks and is often competitive with a fully supervised method without any task-specific training. For instance, we significantly outperform the F1 score of a supervised open information extraction system without using its training set.
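To make the text-to-triple formulation concrete, the sketch below shows one way to elicit a triple from a generic pre-trained sequence-to-sequence model. This is an illustrative assumption, not the paper's method: the model choice (t5-base), the prompt format, and the "(subject; relation; object)" parsing are all hypothetical, whereas the actual framework produces triples task-agnostically from the language model's latent knowledge rather than via free-form prompting.

```python
# A minimal sketch of the text-to-triple translation idea, NOT the
# authors' actual decoding procedure: we prompt an off-the-shelf
# seq2seq model and parse a "(subject; relation; object)" string
# back out of the generation. Model and prompt are assumptions.
from transformers import pipeline

translator = pipeline("text2text-generation", model="t5-base")

def text_to_triple(sentence):
    """Translate input text into one (subject, relation, object) triple."""
    # Hypothetical prompt; the paper instead extracts triples with the
    # pre-trained LM itself rather than relying on a prompt template.
    prompt = f"extract a (subject; relation; object) triple: {sentence}"
    generated = translator(prompt, max_new_tokens=32)[0]["generated_text"]
    parts = [p.strip("() ") for p in generated.split(";")]
    return tuple(parts) if len(parts) == 3 else None

print(text_to_triple("Barack Obama was born in Hawaii."))
# Possible output (model-dependent): ('Barack Obama', 'born in', 'Hawaii')
```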
doi:10.18653/v1/2021.emnlp-main.94