A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Natural Language Statistical Features of LSTM-Generated Texts
2019
IEEE Transactions on Neural Networks and Learning Systems
Long short-term memory (LSTM) networks have recently shown remarkable performance in several natural language generation tasks, such as image captioning and poetry composition. Yet only a few works have analyzed text generated by LSTMs to quantitatively evaluate to what extent such artificial texts resemble those produced by humans. We compared the statistical structure of LSTM-generated language with that of written natural language, and with that of texts produced by Markov models.
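The record does not list the specific statistics the paper uses for this comparison. As an illustrative sketch only, the snippet below contrasts one common natural-language statistic, the word rank-frequency (Zipf) curve, between two text samples; the file names and the choice of measure are assumptions for illustration, not the paper's actual pipeline.

```python
# Minimal sketch: compare the Zipf (rank-frequency) slope of two corpora.
# The corpus file names are hypothetical placeholders.
from collections import Counter
import math


def rank_frequency(text: str):
    """Return (rank, frequency) pairs for the words in `text`, most frequent first."""
    counts = Counter(text.lower().split())
    freqs = sorted(counts.values(), reverse=True)
    return [(rank + 1, f) for rank, f in enumerate(freqs)]


def zipf_slope(pairs):
    """Least-squares slope of log(frequency) vs. log(rank)."""
    xs = [math.log(r) for r, _ in pairs]
    ys = [math.log(f) for _, f in pairs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den


if __name__ == "__main__":
    natural = open("natural_sample.txt").read()     # hypothetical natural-text sample
    generated = open("lstm_sample.txt").read()      # hypothetical LSTM-generated sample
    print("natural Zipf slope:  ", zipf_slope(rank_frequency(natural)))
    print("generated Zipf slope:", zipf_slope(rank_frequency(generated)))
```

A slope close to -1 is the classic Zipf behavior of natural language; comparing the two slopes is one simple way such corpora can be contrasted statistically.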
doi:10.1109/tnnls.2019.2890970
pmid:30951479
fatcat:arxskczkcvgadn67wqksxe3tjq