Cross Entropy of Neural Language Models at Infinity—A New Bound of the Entropy Rate

Shuntaro Takahashi, Kumiko Tanaka-Ishii
Entropy, 2018
Neural language models have drawn a lot of attention for their strong ability to predict natural language text. In this paper, we estimate the entropy rate of natural language with state-of-the-art neural language models. To obtain the estimate, we consider the cross entropy, a measure of the prediction accuracy of neural language models, under the theoretically ideal conditions that they are trained with an infinitely large dataset and receive an infinitely long context for prediction. We empirically verify that the effects of the two parameters, the training data size and the context length, on the cross entropy consistently obey a power-law decay with a positive constant, for two different state-of-the-art neural language models with different language datasets. Based on this verification, we obtain an estimate of 1.12 bits per character for English by extrapolating the two parameters to infinity. This result suggests that the upper bound of the entropy rate of natural language is potentially smaller than previously reported values.
doi:10.3390/e20110839 pmid:33266563
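
The extrapolation the abstract describes can be illustrated with a short curve fit. The sketch below is not the authors' code: it fits a power-law decay with a positive constant, h(n) = h_inf + a * n^(-b), to cross-entropy measurements using scipy.optimize.curve_fit and reads off h_inf as the infinite-data limit. The data points, the function name power_law, and the starting guess p0 are all illustrative assumptions, not values taken from the paper.

    # Minimal sketch (not the authors' code) of the extrapolation described in
    # the abstract: fit a power-law decay with a positive constant,
    #     h(n) = h_inf + a * n**(-b),
    # to cross-entropy measurements and read off h_inf as the n -> infinity limit.
    import numpy as np
    from scipy.optimize import curve_fit

    def power_law(n, h_inf, a, b):
        # Cross entropy as a function of training-data size (or context length).
        return h_inf + a * np.power(n, -b)

    # Hypothetical measurements (dataset size in characters, bits per character);
    # these are illustrative placeholders, not values reported in the paper.
    sizes = np.array([1e6, 3e6, 1e7, 3e7, 1e8])
    bits_per_char = np.array([2.07, 1.84, 1.65, 1.53, 1.42])

    # Fit the three parameters; p0 is a rough starting guess for the optimizer.
    params, _ = curve_fit(power_law, sizes, bits_per_char, p0=[1.0, 10.0, 0.3])
    h_inf, a, b = params

    # h_inf estimates the cross entropy at infinite training data, an upper
    # bound on the entropy rate (the paper reports 1.12 bits per character
    # for English after extrapolating both parameters).
    print(f"extrapolated cross entropy: {h_inf:.2f} bits per character")

Extrapolating in context length would take the same form, with n replaced by the context length; the paper's bound comes from taking both limits.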