File type: application/pdf
Cross Entropy of Neural Language Models at Infinity—A New Bound of the Entropy Rate
Entropy, 2018
Neural language models have drawn a lot of attention for their strong ability to predict natural language text. In this paper, we estimate the entropy rate of natural language with state-of-the-art neural language models. To obtain the estimate, we consider the cross entropy, a measure of the prediction accuracy of neural language models, under the theoretically ideal conditions that they are trained with an infinitely large dataset and receive an infinitely long context for prediction. […]
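For context, a sketch of the standard formulation behind the abstract is given below; the symbols (h, q, x_i) are illustrative and not necessarily the paper's own notation.

```latex
% A sketch of the standard definitions behind the abstract; the symbols
% h, q, and x_i are illustrative and may differ from the paper's notation.

% Entropy rate of a stationary ergodic source X_1, X_2, ...
\[
  h = \lim_{n \to \infty} \frac{1}{n}\, H(X_1, \dots, X_n)
\]

% Empirical cross entropy of a language model q on held-out text x_1, ..., x_n,
% i.e. the quantity reported as the model's prediction accuracy, in bits per symbol:
\[
  \hat{H}(q) = -\frac{1}{n} \sum_{i=1}^{n} \log_2 q(x_i \mid x_1, \dots, x_{i-1})
\]

% The cross entropy exceeds the entropy rate by a non-negative divergence term,
% so \hat{H}(q) upper-bounds h: the better the model, the tighter the bound.
% This is why estimating the cross entropy under the idealized conditions of
% infinite training data and infinite context yields a bound on the entropy rate.
```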
doi:10.3390/e20110839
pmid:33266563
fatcat:h6ipfw45qzbknndt2hg4im2fau