A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Unsupervised Neural Text Simplification
2019
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
The paper presents a first attempt at unsupervised neural text simplification that relies only on unlabeled text corpora. The core framework is composed of a shared encoder and a pair of attentional decoders, crucially assisted by discrimination-based losses and denoising. The framework is trained on unlabeled text collected from an English Wikipedia dump. Our analysis (both quantitative and qualitative, involving human evaluators) on public test data shows that the proposed model can perform text simplification at both the lexical and syntactic levels, competitive with existing supervised methods.
doi:10.18653/v1/p19-1198
dblp:conf/acl/SuryaMLJS19
fatcat:mvbgh5vzzndenbx3zx6e63mwpu