Universal Natural Language Processing with Limited Annotations: Try Few-shot Textual Entailment as a Start
[article] 2020, arXiv pre-print
A standard way to address different NLP problems is by first constructing a problem-specific dataset, then building a model to fit this dataset. To build the ultimate artificial intelligence, we desire a single machine that can handle diverse new problems for which task-specific annotations are limited. We bring up textual entailment as a unified solver for such NLP problems. However, current research on textual entailment has not spilled much ink on the following questions: (i) How well does …
arXiv:2010.02584v1
fatcat:uyhaox2yljaj5bxwnfubmbbxkq
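The abstract describes treating textual entailment as a unified solver for diverse NLP problems. The sketch below is not the paper's few-shot method; it only illustrates the general idea of recasting a labeling task as entailment, using an off-the-shelf NLI model through the Hugging Face `transformers` zero-shot-classification pipeline. The model name, example text, and label set are illustrative assumptions.

```python
# Minimal sketch (assumption, not the paper's method): reuse an NLI model to
# solve a classification task by scoring entailment between the input text
# (premise) and a hypothesis built from each candidate label.
from transformers import pipeline

# The zero-shot-classification pipeline turns each label into a hypothesis
# such as "This example is about <label>." and ranks labels by entailment score.
classifier = pipeline(
    "zero-shot-classification",
    model="facebook/bart-large-mnli",  # example NLI model, an assumption
)

premise = "The team released a new open-source library for graph processing."
candidate_labels = ["software", "sports", "politics"]

result = classifier(premise, candidate_labels)
print(result["labels"][0], result["scores"][0])  # label with highest entailment score
```

Because the task is expressed as entailment rather than as a fixed label space, the same pretrained model can be pointed at new label sets without retraining, which is the setting with limited task-specific annotations that the abstract motivates.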