Multichannel Generative Language Model: Learning All Possible Factorizations Within and Across Channels
2020
Findings of the Association for Computational Linguistics: EMNLP 2020
A channel corresponds to a viewpoint or transformation of an underlying meaning. A pair of parallel sentences in English and French expresses the same underlying meaning, but through two separate channels corresponding to their languages. In this work, we present the Multichannel Generative Language Model (MGLM), a generative joint distribution model over channels that marginalizes over all possible factorizations within and across all channels. MGLM endows flexible inference, including unconditional generation, conditional generation (where one channel is observed and the other channels are generated), and partially observed generation (where incomplete observations are spread across the channels).
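The phrase "marginalizes over all possible factorizations" can be made concrete with a short sketch. The notation below (the concatenated sequence x, the generation order \sigma, and the order prior q) is our own illustration and is not taken from the paper:

% Let x = (x^{(1)}, ..., x^{(K)}) be the concatenation of the K channel
% sequences (e.g., an English sentence and its French translation),
% with n tokens in total. A single model covers every factorization,
% within and across channels, by summing over generation orders sigma:
\[
  p(x) \;=\; \sum_{\sigma \in S_n} q(\sigma)\, p(x \mid \sigma)
       \;=\; \sum_{\sigma \in S_n} q(\sigma) \prod_{t=1}^{n}
         p\bigl(x_{\sigma(t)} \mid x_{\sigma(1)}, \ldots, x_{\sigma(t-1)}\bigr).
\]
% The exact sum over n! orders is intractable, so training would
% maximize the Jensen lower bound under the order prior q:
\[
  \log p(x) \;\geq\; \mathbb{E}_{\sigma \sim q}\bigl[\log p(x \mid \sigma)\bigr].
\]

Under this view, the flexible inference the abstract mentions falls out naturally: conditional generation roughly corresponds to orders that place an observed channel's tokens first, and partially observed generation to orders that place whatever fragments are observed first.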
doi:10.18653/v1/2020.findings-emnlp.376