A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2021; you can also visit the original URL.
The file type is application/pdf.
How Self-Attention Improves Rare Class Performance in a Question-Answering Dialogue Agent
2020
SIGDIAL Conferences
Contextualized language modeling using deep Transformer networks has been applied to a variety of natural language processing tasks with remarkable success. However, we find that these models are not a panacea for a question-answering dialogue agent corpus task, which has hundreds of classes in a long-tailed frequency distribution, with only thousands of data points. Instead, we find substantial improvements in recall and accuracy on rare classes from a simple one-layer RNN with multi-headed self-attention.
dblp:conf/sigdial/StiffSF20
fatcat:c2bnxajhlfcrbdgohqqffycf6u
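
The architecture the abstract describes (a single-layer RNN with multi-headed self-attention over static word embeddings, classifying into hundreds of long-tailed classes) can be illustrated with a minimal PyTorch sketch. This is an assumption-laden illustration, not the authors' implementation: the GRU cell, hidden size, head count, and mean pooling are hypothetical choices filled in for concreteness.

```python
import torch
import torch.nn as nn

class RNNSelfAttentionClassifier(nn.Module):
    """Hypothetical sketch of the kind of model the abstract describes:
    one RNN layer plus multi-headed self-attention over static (frozen)
    word embeddings. All dimensions here are illustrative assumptions."""

    def __init__(self, embedding_matrix, num_classes, hidden_dim=256, num_heads=4):
        super().__init__()
        # Static pretrained word embeddings, kept frozen as features
        self.embed = nn.Embedding.from_pretrained(embedding_matrix, freeze=True)
        # A single bidirectional GRU layer (the specific RNN cell is assumed)
        self.rnn = nn.GRU(embedding_matrix.size(1), hidden_dim,
                          num_layers=1, batch_first=True, bidirectional=True)
        # Multi-headed self-attention over the RNN hidden states
        self.attn = nn.MultiheadAttention(2 * hidden_dim, num_heads,
                                          batch_first=True)
        # Linear classifier over the (long-tailed) class inventory
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        x = self.embed(token_ids)      # (batch, seq, emb_dim)
        h, _ = self.rnn(x)             # (batch, seq, 2 * hidden_dim)
        a, _ = self.attn(h, h, h)      # self-attention: query = key = value
        pooled = a.mean(dim=1)         # mean-pool attended states (assumed)
        return self.classifier(pooled) # (batch, num_classes) logits

# Example with made-up sizes: 300-d embeddings for a 10k-word vocabulary,
# and several hundred classes, as in the task the abstract mentions.
emb = torch.randn(10_000, 300)
model = RNNSelfAttentionClassifier(emb, num_classes=500)
logits = model(torch.randint(0, 10_000, (8, 20)))
```

A production version would also need padding masks for variable-length utterances and a pooling strategy validated against the data; those details are not specified in the abstract.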