11,888 Hits in 9.0 sec

Context-Aware Smoothing for Neural Machine Translation

Kehai Chen, Rui Wang, Masao Utiyama, Eiichiro Sumita, Tiejun Zhao
2017 International Joint Conference on Natural Language Processing  
In Neural Machine Translation (NMT), each word is represented as a low-dimensional, real-valued vector for encoding its syntactic and semantic information.  ...  The learned context-aware representation is integrated into the NMT to improve the translation performance.  ...  Acknowledgments We are grateful to the anonymous reviewers for their insightful comments and suggestions. This  ... 
dblp:conf/ijcnlp/ChenWUSZ17 fatcat:3ghfbzknf5hxjevhlbr3sk7lru

Smooth Bilingual N-Gram Translation

Holger Schwenk, Marta R. Costa-jussà, José A. R. Fonollosa
2007 Conference on Empirical Methods in Natural Language Processing  
We address the problem of smoothing translation probabilities in a bilingual N-gram-based statistical machine translation system.  ...  A neural network is used to perform the projection and the probability estimation. Smoothing probabilities is most important for tasks with a limited amount of training material.  ...  We are only aware of one work that performs a systematic comparison of smoothing techniques in phrase-based machine translation systems (Foster et al., 2006).  ... 
dblp:conf/emnlp/SchwenkCF07 fatcat:twfkfpiumjfshk6hkrxmlm4i4y

Addressing Zero-Resource Domains Using Document-Level Context in Neural Machine Translation [article]

Dario Stojanovski, Alexander Fraser
2021 arXiv   pre-print
Achieving satisfying performance in machine translation on domains for which there is no training data is challenging.  ...  We obtain improvements for the two zero-resource domains we study. We additionally provide an analysis where we vary the amount of context and look at the case where in-domain data is available.  ...  A Large-Scale Test Set for the Evaluation of Context-Aware Pronoun Translation in Neural Machine Translation.  ... 
arXiv:2004.14927v2 fatcat:mtlhfxjwzbakjiko32rvzholru

Context-aware Decoder for Neural Machine Translation using a Target-side Document-Level Language Model [article]

Amane Sugiyama, Naoki Yoshinaga
2021 arXiv   pre-print
Although many context-aware neural machine translation models have been proposed to incorporate contexts in translation, most of those models are trained end-to-end on parallel documents aligned at the sentence level  ...  We show the effectiveness of our approach in three language pairs, English to French, English to Russian, and Japanese to English, by evaluation in BLEU and contrastive tests for context-aware translation  ...  We also thank Masato Neishi for technical advice on implementations of neural machine translation.  ... 
arXiv:2010.12827v2 fatcat:z7hwi6seqbht7l4psovmyq4iem

Morphology-aware Word-Segmentation in Dialectal Arabic Adaptation of Neural Machine Translation

Ahmed Tawfik, Mahitab Emam, Khaled Essam, Robert Nabil, Hany Hassan
2019 Proceedings of the Fourth Arabic Natural Language Processing Workshop  
Parallel corpora available for building machine translation (MT) models for dialectal Arabic (DA) are rather limited.  ...  A set of experiments conducted on Egyptian Arabic (EA), Levantine Arabic (LA), and Gulf Arabic (GA) shows that a sufficiently accurate morphology-aware segmentation used in conjunction with BPE or SR outperforms  ...  Section 3 reviews the neural machine translation approach that we use to train and adapt translation models for dialectal Arabic.  ... 
doi:10.18653/v1/w19-4602 dblp:conf/wanlp/TawfikEENH19 fatcat:hrnonaofrjdyfnumtndaejywte

Combining Local and Document-Level Context: The LMU Munich Neural Machine Translation System at WMT19

Dario Stojanovski, Alexander Fraser
2019 Proceedings of the Fourth Conference on Machine Translation (Volume 2: Shared Task Papers, Day 1)  
We describe LMU Munich's machine translation system for English→German translation which was used to participate in the WMT19 shared task on supervised news translation.  ...  The system used as a primary submission is a context-aware Transformer capable of both rich modeling of limited contextual information and integration of large-scale document-level context with a less  ...  Acknowledgments We would like to thank the anonymous reviewers for their valuable input.  ... 
doi:10.18653/v1/w19-5345 dblp:conf/wmt/StojanovskiF19 fatcat:ssuliypx4nf3pemceivc6ggr3q

Multi-Domain Neural Machine Translation with Word-Level Adaptive Layer-wise Domain Mixing [article]

Haoming Jiang, Chen Liang, Chong Wang, Tuo Zhao
2021 arXiv   pre-print
Many multi-domain neural machine translation (NMT) models achieve knowledge transfer by enforcing one encoder to learn shared embedding across domains.  ...  To overcome this limitation, we propose a novel multi-domain NMT model using individual modules for each domain, on which we apply word-level, adaptive and layer-wise domain mixing.  ...  Introduction Neural Machine Translation (NMT) has made significant progress in various machine translation tasks (Kalchbrenner and Blunsom, 2013; Sutskever et al., 2014; Bahdanau et al., 2014; Luong et  ... 
arXiv:1911.02692v3 fatcat:cfg7bygjqfeozbt3k5iig7wlny

Isochrony-Aware Neural Machine Translation for Automatic Dubbing [article]

Derek Tam, Surafel M. Lakew, Yogesh Virkar, Prashant Mathur, Marcello Federico
2022 arXiv   pre-print
We introduce the task of isochrony-aware machine translation which aims at generating translations suitable for dubbing.  ...  In this work, we propose implicit and explicit modeling approaches to integrate isochrony information into neural machine translation.  ...  Isochrony-Aware Machine Translation The task of IAMT involves correctly translating source-language sentences containing pause markers into the target language, which includes 1) projection of the pause  ... 
arXiv:2112.08548v2 fatcat:hnebsendp5grdkx5jtm4ushsvu

Character-based Neural Machine Translation [article]

Marta R. Costa-Jussà, José A. R. Fonollosa
2016 arXiv   pre-print
Neural Machine Translation (MT) has reached state-of-the-art results.  ...  The resulting unlimited-vocabulary and affix-aware source word embeddings are tested in a state-of-the-art neural MT based on an attention-based bidirectional recurrent neural network.  ...  Character-based Machine Translation Word embeddings have been shown to boost the performance in many NLP tasks, including machine translation.  ... 
arXiv:1603.00810v3 fatcat:iqdsozmxtfd4hf3fnqarjkv7ou

Constraint Translation Candidates: A Bridge between Neural Query Translation and Cross-lingual Information Retrieval [article]

Tianchi Bi and Liang Yao and Baosong Yang and Haibo Zhang and Weihua Luo and Boxing Chen
2020 arXiv   pre-print
With the help of deep learning, neural machine translation (NMT) has shown promising results on various tasks.  ...  The two shortcomings of QT result in texts that are readable for humans but inadequate as candidates for the downstream retrieval task.  ...  BACKGROUND 2.1 Neural Machine Translation Neural machine translation (NMT) [1, 17] is a recently proposed approach to machine translation which builds a single neural network that takes a source sentence  ... 
arXiv:2010.13658v1 fatcat:voprjm6qnnhgxme2s5bs2xkfje

Stable and Effective Trainable Greedy Decoding for Sequence to Sequence Learning

Yun Chen, Kyunghyun Cho, Samuel R. Bowman, Victor O. K. Li
2018 International Conference on Learning Representations  
We propose a small neural network actor that observes and manipulates the hidden state of a previously-trained decoder. We evaluate our model on the task of neural machine translation.  ...  Experiments on several datasets and models show that our method yields substantial improvements in both translation quality and translation speed over its base system, with no additional data.  ...  We demonstrate this for neural machine translation on three state-of-the-art architectures and two corpora.  ... 
dblp:conf/iclr/ChenCBL18 fatcat:krxakzepmfas5cj4zd4qimern4

Converting Continuous-Space Language Models into N-gram Language Models with Efficient Bilingual Pruning for Statistical Machine Translation

Rui Wang, Masao Utiyama, Isao Goto, Eiichiro Sumita, Hai Zhao, Bao-Liang Lu
2016 ACM Transactions on Asian and Low-Resource Language Information Processing  
Neural network language models, or continuous-space language models (CSLMs), have been shown to improve the performance of statistical machine translation (SMT) when they are used for reranking n-best translations.  ...  Acknowledgments We appreciate the helpful discussion with Andrew Finch and Paul Dixon, and thank the three anonymous reviewers for many invaluable comments and suggestions to improve our paper.  ... 
doi:10.1145/2843942 fatcat:e4mxvxc3fravdohnliamixrgfe

Converting Continuous-Space Language Models into N-Gram Language Models for Statistical Machine Translation

Rui Wang, Masao Utiyama, Isao Goto, Eiichiro Sumita, Hai Zhao, Bao-Liang Lu
2013 Conference on Empirical Methods in Natural Language Processing  
Neural network language models, or continuous-space language models (CSLMs), have been shown to improve the performance of statistical machine translation (SMT) when they are used for reranking n-best translations.  ...  Acknowledgments We appreciate the helpful discussion with Andrew Finch and Paul Dixon, and thank the three anonymous reviewers for many invaluable comments and suggestions to improve our paper.  ... 
dblp:conf/emnlp/WangUGSZL13 fatcat:hehqnne72rhenh7lotlayb2f4m

Urdu-English Machine Transliteration using Neural Networks [article]

Usman Mohy ud Din
2020 arXiv   pre-print
This approach is tested on three models of statistical machine translation (SMT), which include phrase-based, hierarchical phrase-based and factor-based models, and two models of neural machine translation  ...  Despite significant progress in the domain of machine translation, translation of out-of-vocabulary (OOV) words, which include technical terms, named entities, and foreign words, is still a challenge for current  ...  Syntactic & Semantic Challenges Urdu to English machine translation has some unique challenges which we have to overcome to produce a reliable and context-aware translation system [32].  ... 
arXiv:2001.05296v1 fatcat:5zgacud2dfg5jiuzock7x5l4tq

Neural Machine Translation with Reordering Embeddings

Kehai Chen, Rui Wang, Masao Utiyama, Eiichiro Sumita
2019 Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics  
Improved neural machine translation with a syntax-aware encoder and decoder.  ...  The reordering model plays an important role in phrase-based statistical machine translation. However, there are few works that exploit the reordering information in neural machine translation.  ...  Rui Wang was partially supported by JSPS grant-in-aid for early-career scientists (19K20354): "Unsupervised Neural Machine Translation in Universal Scenarios" and NICT tenure-track researcher startup fund  ... 
doi:10.18653/v1/p19-1174 dblp:conf/acl/ChenWUS19 fatcat:d3dmjikazfd7dgz73acinb3j2y
Showing results 1 — 15 out of 11,888 results