2,759 Hits in 5.7 sec

A Latent Morphology Model for Open-Vocabulary Neural Machine Translation [article]

Duygu Ataman, Wilker Aziz, Alexandra Birch
2020 arXiv   pre-print
Translation into morphologically-rich languages challenges neural machine translation (NMT) models with extremely sparse vocabularies where atomic treatment of surface forms is unrealistic.  ...  In this paper, we propose to translate words by modeling word formation through a hierarchical latent variable model which mimics the process of morphological inflection.  ...  NEURAL MACHINE TRANSLATION In this paper, we use recurrent NMT architectures based on the model developed by .  ... 
arXiv:1910.13890v3 fatcat:ilc4u37nabb4ldzfmoirhxabce

Neural Language Generation: Formulation, Methods, and Evaluation [article]

Cristina Garbacea, Qiaozhu Mei
2020 arXiv   pre-print
Next we include a comprehensive outline of methods and neural architectures employed for generating diverse texts.  ...  Recent advances in neural network-based generative modeling have reignited hopes of computer systems capable of seamlessly conversing with humans and of understanding natural language.  ...  Approaches for neural abstractive summarization build upon advances in machine translation.  ... 
arXiv:2007.15780v1 fatcat:oixtreazxvbgvclicpxiqzbxrm

A Hierarchical Latent Vector Model for Learning Long-Term Structure in Music [article]

Adam Roberts, Jesse Engel, Colin Raffel, Curtis Hawthorne, Douglas Eck
2019 arXiv   pre-print
To address this issue, we propose the use of a hierarchical decoder, which first outputs embeddings for subsequences of the input and then uses these embeddings to generate each subsequence independently.  ...  The Variational Autoencoder (VAE) has proven to be an effective model for producing semantically meaningful latent representations for natural data.  ...  Neural machine translation by jointly learning to align and translate. Gulrajani, I., Kumar, K., Ahmed, F., Taiga, A. A., Visin, F., Vazquez, D., and Courville, A.  ... 
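The two-level decoding described in this snippet (a "conductor" emits one embedding per subsequence, and each subsequence is then generated from its embedding alone) can be sketched as a toy. All function names and the arithmetic below are illustrative assumptions, not the paper's actual architecture:

```python
def conductor(z, num_subsequences):
    """Toy 'conductor': map one latent code z to one embedding per
    subsequence (here, just shifted copies of z)."""
    return [[v + i for v in z] for i in range(num_subsequences)]

def decode_subsequence(embedding, length):
    """Toy low-level decoder: generate a subsequence from its embedding
    alone, independently of the other subsequences."""
    return [sum(embedding) * (t + 1) for t in range(length)]

def hierarchical_decode(z, num_subsequences=3, length=2):
    """First emit per-subsequence embeddings, then decode each one
    independently -- the two-level structure described in the abstract."""
    return [decode_subsequence(e, length)
            for e in conductor(z, num_subsequences)]

out = hierarchical_decode([1.0, 1.0])  # [[2.0, 4.0], [4.0, 8.0], [6.0, 12.0]]
```

The point of the hierarchy is that long-range structure lives in the conductor's embeddings, while each low-level decoder only has to model a short subsequence.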
arXiv:1803.05428v5 fatcat:cemoujcdxjh3bnc7pp3ztmbafm

Document-Level Neural Machine Translation with Hierarchical Attention Networks [article]

Lesly Miculicich, Dhananjay Ram, Nikolaos Pappas, James Henderson
2018 arXiv   pre-print
Neural Machine Translation (NMT) can be improved by including document-level contextual information.  ...  For this purpose, we propose a hierarchical attention model to capture the context in a structured and dynamic manner.  ...  Acknowledgments We are grateful for the support of the European Union under the Horizon 2020 SUMMA project n. 688139, see www.summa-project.eu.  ... 
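The "structured and dynamic" context capture described here can be illustrated as two-level attention: attend over the words of each previous sentence, then attend over the resulting sentence summaries. The vectors, dot-product scoring, and function names below are illustrative toys, not the paper's architecture:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attend(query, items):
    """Dot-product attention: average the items, weighted by their
    similarity to the query."""
    weights = softmax([dot(query, item) for item in items])
    dim = len(items[0])
    return [sum(w * item[d] for w, item in zip(weights, items))
            for d in range(dim)]

def hierarchical_context(query, prev_sentences):
    """Two-level ('hierarchical') attention: word-level attention inside
    each previous sentence yields one summary per sentence; sentence-level
    attention over the summaries yields one document context vector."""
    summaries = [attend(query, words) for words in prev_sentences]
    return attend(query, summaries)
```

A usage sketch: `hierarchical_context([1.0, 0.0], [[[1.0, 0.0], [0.0, 1.0]], [[1.0, 1.0]]])` returns a single 2-dimensional context vector summarizing both previous sentences.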
arXiv:1809.01576v2 fatcat:scobsungz5gffggndjznqur5fi

Document-Level Neural Machine Translation with Hierarchical Attention Networks

Lesly Miculicich Werlen, Dhananjay Ram, Nikolaos Pappas, James Henderson
2018 Zenodo  
Neural Machine Translation (NMT) can be improved by including document-level contextual information.  ...  For this purpose, we propose a hierarchical attention model to capture the context in a structured and dynamic manner.  ...  Acknowledgments We are grateful for the support of the European Union under the Horizon 2020 SUMMA project n. 688139, see www.summa-project.eu.  ... 
doi:10.5281/zenodo.2276024 fatcat:nckpos6tbja2je5auso2apqoji

Neural Dialogue Generation Methods in Open Domain: A Survey

Bin Sun, Kan Li
2021 Natural Language Processing Research  
., Encoder-Decoder framework-based methods, Hierarchical Recurrent Encoder-Decoder (HRED)-based methods, Variational Autoencoder (VAE)-based methods, Reinforcement Learning (RL)-based methods, Generative  ...  In the first stage, many dialogue systems were based on rules and frames; that is, the related keywords were set in advance, and a response framework was designed for these keywords.  ...  ACKNOWLEDGMENTS We are grateful to the anonymous reviewers for their valuable and constructive advice on previous versions of this article; all remaining errors are our own.  ... 
doi:10.2991/nlpr.d.210223.001 fatcat:mqcjkf7vczfkdjhdtbznupmz2e

Why is constrained neural language generation particularly challenging? [article]

Cristina Garbacea, Qiaozhu Mei
2022 arXiv   pre-print
However, controlling the output of these models for desired user and task needs is still an open challenge.  ...  conditions and constraints (the latter being testable conditions on the output text instead of the input), present constrained text generation tasks, and review existing methods and evaluation metrics for  ...  Variational neural machine translation [163] incorporates a continuous latent variable to model the underlying semantics of sentence pairs.  ... 
arXiv:2206.05395v1 fatcat:nnoqdgda4be45lj5tjo3xt7kbe

A Novel Keyword Generation Model Based on Topic-Aware and Title-Guide

Jialin Ma, Jieyi Cheng, Yue Zhang, Dalin Zhang
2022 Computational Intelligence and Neuroscience  
In the KGM-TT, the neural topic model is used to identify the latent topic words, and a hierarchical encoder with an attention mechanism encodes the title and the content separately.  ...  In this paper, we introduce a novel Keyword Generation Model based on Topic-aware and Title-guide (KGM-TT).  ...  First, the neural topic model based on the variational autoencoder is used to generate the latent topics of the document, and the hierarchical encoder with attention mechanism encodes the corpus; then,  ... 
doi:10.1155/2022/1787369 pmid:35655495 pmcid:PMC9152386 fatcat:63t3fq2hd5ewfdztvut26jxw3i

Convolutional neural network based hierarchical autoencoder for nonlinear mode decomposition of fluid field data [article]

Kai Fukami, Taichi Nakamura, Koji Fukagata
2020 arXiv   pre-print
We propose a customized convolutional neural network based autoencoder called a hierarchical autoencoder, which allows us to extract nonlinear autoencoder modes of flow fields while preserving the contribution order of the latent vectors.  ...  /Polimi) for stimulating discussions and fruitful comments. DATA AVAILABILITY The data that support the findings of this study are available from the corresponding author upon reasonable request.  ... 
arXiv:2006.06977v2 fatcat:f7rzlnpjcjaejddwx7ziw44ufy

Long and Diverse Text Generation with Planning-based Hierarchical Variational Model

Zhihong Shao, Minlie Huang, Jiangtao Wen, Wenfei Xu, xiaoyan zhu
2019 Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)  
To address these issues, we propose a Planning-based Hierarchical Variational Model (PHVM).  ...  Existing neural methods for data-to-text generation are still struggling to produce long and diverse texts: they are insufficient to model input data dynamically during generation, to capture inter-sentence  ...  Acknowledgements We would like to thank THUNUS NExT Joint-Lab for the support.  ... 
doi:10.18653/v1/d19-1321 dblp:conf/emnlp/ShaoHWXZ19 fatcat:bjpwzuerojgy3o4sr34moxkidu

Sequence-to-Sequence Learning with Latent Neural Grammars [article]

Yoon Kim
2021 arXiv   pre-print
We apply this latent neural grammar to various domains -- a diagnostic language navigation task designed to test for compositional generalization (SCAN), style transfer, and small-scale machine translation  ...  Sequence-to-sequence learning with neural networks has become the de facto standard for sequence prediction tasks.  ...  This is reminiscent of classic hierarchical phrase-based approaches to machine translation where the extracted phrases often do not correspond to linguistic phrases [20] .  ... 
arXiv:2109.01135v7 fatcat:x6rc3nttz5c4jczie27vpdifdm

Text Style Transfer: A Review and Experimental Evaluation [article]

Zhiqiang Hu, Roy Ka-Wei Lee, Charu C. Aggarwal, Aston Zhang
2022 arXiv   pre-print
We review the existing evaluation methodologies for TST tasks and conduct a large-scale reproducibility study where we experimentally benchmark 19 state-of-the-art TST algorithms on two publicly available  ...  Neural Machine Translation Neural machine translation, a deep learning-based approach to machine translation, is a well-studied research area [16, 119, 3] .  ...  Unlike the traditional statistical machine translation techniques [6, 63] , neural machine translation can perform end-to-end training of a machine translation model without the need to deal with word  ... 
arXiv:2010.12742v3 fatcat:y5prl3zlvrea7d26dt5yvqbpvm

A Survey on Dialogue Systems: Recent Advances and New Frontiers [article]

Hongshen Chen, Xiaorui Liu, Dawei Yin, Jiliang Tang
2018 arXiv   pre-print
For dialogue systems, deep learning can leverage a massive amount of data to learn meaningful feature representations and response generation strategies, while requiring a minimal amount of hand-crafting.  ...  The success of applying deep learning in machine translation, namely Neural Machine Translation, spurs the enthusiasm of researchers in neural generative dialogue systems.  ...  [64] proposed a generative probabilistic model, based on phrase-based Statistical Machine Translation [118], to model conversations on micro-blogging.  ... 
arXiv:1711.01731v3 fatcat:6wuovcynqbhlzmuorchn4mn6ma

Multimodal Data Processing Framework for Smart City: A Positional-attention Based Deep Learning Approach

Qianxia Ma, Yongfang Nie, Jingyan Song, Tao Zhang
2020 IEEE Access  
The encoder-decoder mechanism was the most commonly used neural machine translation architecture before 2015.  ...  In an LSTM-LSTM model, for example, the hidden state of the decoder h_i at time i is computed from the previous hidden state h_{i-1}, the previous output y_{i-1}, and the context vector c_i.  ... 
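The recurrence in this snippet has the shape h_i = f(h_{i-1}, y_{i-1}, c_i). A minimal elementwise sketch, with purely illustrative weights and a tanh nonlinearity standing in for the full LSTM cell:

```python
import math

def decoder_step(h_prev, y_prev, c, w_h=0.5, w_y=0.3, w_c=0.2):
    """One toy decoder step: the new hidden state h_i depends on the
    previous hidden state h_{i-1}, the previous output y_{i-1}, and the
    context vector c_i. The scalar weights are illustrative, not learned."""
    return [math.tanh(w_h * h + w_y * y + w_c * ci)
            for h, y, ci in zip(h_prev, y_prev, c)]

# Unroll two decoding steps from a zero initial state.
h = [0.0, 0.0]
for y_prev, c in [([1.0, -1.0], [0.5, 0.5]), ([0.5, 0.5], [0.2, 0.8])]:
    h = decoder_step(h, y_prev, c)
```

In a real NMT decoder each of these quantities is a vector transformed by learned weight matrices, and c_i itself is recomputed per step by the attention mechanism.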
doi:10.1109/access.2020.3041447 fatcat:fh3u6z3f65dndaxjfrxiprnhsy

Long and Diverse Text Generation with Planning-based Hierarchical Variational Model [article]

Zhihong Shao, Minlie Huang, Jiangtao Wen, Wenfei Xu, Xiaoyan Zhu
2019 arXiv   pre-print
To address these issues, we propose a Planning-based Hierarchical Variational Model (PHVM).  ...  Existing neural methods for data-to-text generation are still struggling to produce long and diverse texts: they are insufficient to model input data dynamically during generation, to capture inter-sentence  ...  We would like to thank THUNUS NExT Joint-Lab for the support.  ... 
arXiv:1908.06605v2 fatcat:4pv5yb7crzchjour4zrtxv2llm
Showing results 1 — 15 of 2,759 results