
Generative Adversarial Network-Based Short Sequence Machine Translation from Chinese to English

Wenting Ma, Bing Yan, Lianyue Sun, Baiyuan Ding
2022 Scientific Programming  
Finally, experimental results on an English-Chinese translation dataset show that translation quality is improved by more than 8% compared with the traditional neural machine translation model based on  ...  In this context, communication between different languages has also become closer, so accurate translation between languages is of great significance.  ...  Related Works: Based on the coverage idea in statistical machine translation, a coverage mechanism is introduced into the neural machine translation model in [10], and a coverage vector (CV) is used to store  ... 
doi:10.1155/2022/7700467 fatcat:mz5xwllmdbhvnmskbivuytuf24

A SYSTEMATIC READING IN STATISTICAL TRANSLATION: FROM THE STATISTICAL MACHINE TRANSLATION TO THE NEURAL TRANSLATION MODELS

Zakaria El Maazouzi, Badr Eddine EL Mohajir, Mohammed Al Achhab
2017 Journal of Information and Communication Technology  
Achieving high accuracy in automatic translation tasks has been one of the challenging goals for researchers in the area of machine translation for decades.  ...  Automatic translation, as a key application in the natural language processing domain, has developed many approaches, namely statistical machine translation and, more recently, neural machine translation, which improved  ...  ACKNOWLEDGMENT We would like to acknowledge the National Center for Scientific and Technical Research (CNRST) for supporting this study under the framework of Merit Scholarship Ref: L 02/13.  ... 
doi:10.32890/jict2017.16.2.8239 fatcat:y6rqlyztnnfztofugcqhhu6qci

Neural Machine Translation [article]

Philipp Koehn
2017 arXiv   pre-print
Written as a chapter for the textbook Statistical Machine Translation. Used in the JHU Fall 2017 class on machine translation.  ...  Draft of a textbook chapter on neural machine translation: a comprehensive treatment of the topic, ranging from an introduction to neural networks and computation graphs to a description of the currently dominant  ...  Cohn et al. (2016) add a number of biases to model coverage, fertility, and alignment, inspired by traditional statistical machine translation models.  ... 
arXiv:1709.07809v1 fatcat:kj23sup7yfaxvllfha4v7xbugq

Coverage for Character Based Neural Machine Translation

M. Bashir Kazimi, Marta R. Costa-jussà
2017 Revista de Procesamiento de Lenguaje Natural (SEPLN)  
In recent years, Neural Machine Translation (NMT) has achieved state-of-the-art performance in translating from one language (the source) to another (the target).  ...  To address this problem, a coverage model has been integrated into NMT to keep track of already-translated words and focus on the untranslated ones.  ...  Acknowledgements This work is supported by Ministerio de Economía y Competitividad and Fondo Europeo de Desarrollo Regional, through contract TEC2015-69266-P (MINECO/FEDER, UE) and the postdoctoral senior  ... 
dblp:journals/pdln/KazimiC17 fatcat:4g3mxoxotjbfrnuoprcq2xhg3y

Neural Machine Translation with Adequacy-Oriented Learning

Xiang Kong, Zhaopeng Tu, Shuming Shi, Eduard Hovy, Tong Zhang
2019 Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence  
Although Neural Machine Translation (NMT) models have advanced the state of the art in machine translation, they face problems such as inadequate translation.  ...  Quantitative and qualitative analyses on different language pairs and NMT architectures demonstrate the effectiveness and universality of the proposed approach.  ...  Introduction During the past several years, rapid progress has been made in the field of Neural Machine Translation (NMT) (Kalchbrenner and Blunsom 2013; Sutskever, Vinyals, and Le 2014; Bahdanau, Cho  ... 
doi:10.1609/aaai.v33i01.33016618 fatcat:lycqyzclf5df3fspr3dhbssh6e

Training and Inference Methods for High-Coverage Neural Machine Translation

Michael Yang, Yixin Liu, Rahul Mayuranath
2020 Proceedings of the Fourth Workshop on Neural Generation and Translation  
In this paper, we introduce a system built for the Duolingo Simultaneous Translation And Paraphrase for Language Education (STAPLE) shared task at the 4th Workshop on Neural Generation and Translation  ...  For inference, encouraging a small amount of diversity with Diverse Beam Search to improve translation coverage yielded a marginal improvement over regular Beam Search.  ...  We introduce a neural machine translation (NMT) system that generates high-coverage translation sets for a single given prompt in the source language.  ... 
doi:10.18653/v1/2020.ngt-1.13 dblp:conf/aclnmt/YangLM20 fatcat:lio3cjlv7vhw5dvqvorbwedzue
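The diversity mechanism this entry refers to can be sketched roughly. Below is a minimal, self-contained illustration (not the authors' implementation) of the Hamming-diversity flavor of Diverse Beam Search: beam groups pick tokens in turn, and later groups are penalized for tokens that earlier groups already chose at the same decoding step. All function and variable names are illustrative.

```python
import numpy as np

def diverse_beam_step(group_scores, group_size, diversity_strength):
    """One decoding step of Diverse Beam Search with a Hamming diversity term.

    group_scores: array of shape (num_groups, vocab_size) holding per-group
                  token log-probabilities at the current step.
    Later groups subtract a penalty proportional to how often each token was
    already selected by earlier groups, which encourages diverse outputs.
    """
    num_groups, vocab_size = group_scores.shape
    penalties = np.zeros(vocab_size)
    chosen = []
    for g in range(num_groups):
        adjusted = group_scores[g] - diversity_strength * penalties
        picks = np.argsort(-adjusted)[:group_size]
        chosen.append([int(t) for t in picks])
        for t in picks:
            penalties[t] += 1.0
    return chosen

# Two groups with identical scores: with a strong diversity term,
# the second group avoids the first group's choice.
scores = np.array([[3.0, 2.0, 1.0, 0.0],
                   [3.0, 2.0, 1.0, 0.0]])
print(diverse_beam_step(scores, group_size=1, diversity_strength=10.0))
# → [[0], [1]]
```

With `diversity_strength=0.0` both groups would pick token 0, collapsing back to ordinary grouped beam search.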

Modeling Coverage for Neural Machine Translation [article]

Zhaopeng Tu, Zhengdong Lu, Yang Liu, Xiaohua Liu, Hang Li
2016 arXiv   pre-print
Attention mechanism has enhanced state-of-the-art Neural Machine Translation (NMT) by jointly learning to align and translate.  ...  We maintain a coverage vector to keep track of the attention history.  ...  Yang Liu is supported by the National Natural Science Foundation of China (No. 61522204) and the 863 Program (2015AA011808). We thank the anonymous reviewers for their insightful comments.  ... 
arXiv:1601.04811v6 fatcat:vi65evnubrfezaee34q3py43ta
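The coverage vector mentioned in this entry's snippet can be illustrated with a toy numerical sketch. This is the simple accumulation variant only (the paper also proposes learned and fertility-based variants, omitted here); the attention distributions below are made up for illustration.

```python
import numpy as np

def update_coverage(coverage, attention):
    """Accumulate the current step's attention distribution into a running
    per-source-word coverage vector (simple accumulation variant)."""
    return coverage + attention

def coverage_penalty(coverage):
    """Sum of attention mass beyond 1.0 per source word: a rough signal for
    over-translation (near-zero entries instead signal under-translation)."""
    return float(np.sum(np.maximum(coverage - 1.0, 0.0)))

# Toy decoding trace over a 3-word source sentence
coverage = np.zeros(3)
for attention in [np.array([0.7, 0.2, 0.1]),
                  np.array([0.6, 0.3, 0.1]),
                  np.array([0.1, 0.5, 0.4])]:
    coverage = update_coverage(coverage, attention)

# Word 0 has accumulated ~1.4 attention mass: likely translated twice.
print(coverage, coverage_penalty(coverage))
```

In the actual model the coverage vector additionally feeds back into the attention computation, so future steps are steered toward under-covered source words.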

Analysis of Strategies and Skills of English Translation Based on Coverage Mechanism

Bin Liu, Jing Wang, Le Sun
2022 Computational Intelligence and Neuroscience  
Finally, a dual attention decoding method based on the fused coverage mechanism is adopted.  ...  In order to alleviate the problems of over-translation and missing translation in NMT, based on the consistency and complementarity of the information stored in different coverage models, a multi-coverage fusion  ...  Neural machine translation based on an encoder-decoder structure is a general model that is not fully designed for the machine translation task itself, so there are still some problems to be solved.  ... 
doi:10.1155/2022/7767045 pmid:35909859 pmcid:PMC9334112 fatcat:kw5k3hpuffeyrcv6eywt4q46du

Neural Machine Translation with Adequacy-Oriented Learning [article]

Xiang Kong, Zhaopeng Tu, Shuming Shi, Eduard Hovy, Tong Zhang
2018 arXiv   pre-print
Although Neural Machine Translation (NMT) models have advanced the state of the art in machine translation, they face problems such as inadequate translation.  ...  Quantitative and qualitative analyses on different language pairs and NMT architectures demonstrate the effectiveness and universality of the proposed approach.  ...  Conclusion In this work, we propose a novel learning approach for RL-based NMT models, which integrates into the policy gradient an adequacy-oriented reward designed specifically for translation.  ... 
arXiv:1811.08541v1 fatcat:vcnxvmvffbftlka44jy6st4cdy

Prior Knowledge Integration for Neural Machine Translation using Posterior Regularization [article]

Jiacheng Zhang, Yang Liu, Huanbo Luan, Jingfang Xu, Maosong Sun
2018 arXiv   pre-print
In this work, we propose to use posterior regularization to provide a general framework for integrating prior knowledge into neural machine translation.  ...  We represent prior knowledge sources as features in a log-linear model, which guides the learning process of the neural translation model.  ...  Acknowledgments We thank Shiqi Shen for useful discussions and anonymous reviewers for insightful comments.  ... 
arXiv:1811.01100v1 fatcat:za4zbk7mwva4fem7frhl7roijq

Prior Knowledge Integration for Neural Machine Translation using Posterior Regularization

Jiacheng Zhang, Yang Liu, Huanbo Luan, Jingfang Xu, Maosong Sun
2017 Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)  
In this work, we propose to use posterior regularization to provide a general framework for integrating prior knowledge into neural machine translation.  ...  We represent prior knowledge sources as features in a log-linear model, which guides the learning process of the neural translation model.  ...  Acknowledgments We thank Shiqi Shen for useful discussions and anonymous reviewers for insightful comments.  ... 
doi:10.18653/v1/p17-1139 dblp:conf/acl/ZhangLLXS17 fatcat:o5qhg3v4hvhgjcr6tfqzor7gdy

Speeding Up Neural Machine Translation Decoding by Shrinking Run-time Vocabulary

Xing Shi, Kevin Knight
2017 Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)  
On certain low-resource language pairs, the same methods improve BLEU by 0.5 points.  ...  We speed up Neural Machine Translation (NMT) decoding by shrinking the run-time target vocabulary. We experiment with two shrinking approaches: Locality Sensitive Hashing (LSH) and word alignments.  ...  For each word f in the source vocabulary of the neural machine translation model, store the top M target words according to P(e|f).  ... 
doi:10.18653/v1/p17-2091 dblp:conf/acl/ShiK17 fatcat:dvp2majbynfrveuzw44wgvl7ky
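The alignment-based shrinking step quoted in this entry (keep the top M target words per source word by P(e|f), then decode over their union) can be sketched as follows. The lexical table here is a made-up toy, not data from the paper, and the special-token list is an assumption.

```python
def build_candidate_lists(lex_probs, M):
    """For each source word f, keep the M most likely target words by P(e|f),
    e.g. taken from a word-alignment lexical translation table."""
    return {f: [e for e, _ in sorted(dist.items(), key=lambda kv: -kv[1])[:M]]
            for f, dist in lex_probs.items()}

def runtime_vocab(source_sentence, candidates, always_keep=("<s>", "</s>", "<unk>")):
    """Per-sentence decoding vocabulary: the union of the candidate lists of
    the sentence's source words, plus special tokens the decoder always needs."""
    vocab = set(always_keep)
    for f in source_sentence:
        vocab.update(candidates.get(f, []))
    return vocab

# Toy lexical table P(e|f); unseen source words contribute no candidates.
lex = {"chat": {"cat": 0.80, "chat": 0.10, "kitty": 0.05},
       "noir": {"black": 0.90, "dark": 0.08}}
top2 = build_candidate_lists(lex, M=2)
vocab = runtime_vocab(["le", "chat", "noir"], top2)
print(sorted(vocab))
```

The decoder's output softmax is then computed only over this small per-sentence set instead of the full target vocabulary, which is where the speedup comes from.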

Decoding with Value Networks for Neural Machine Translation

Di He, Hanqing Lu, Yingce Xia, Tao Qin, Liwei Wang, Tie-Yan Liu
2017 Neural Information Processing Systems  
Neural Machine Translation (NMT) has become a popular technology in recent years, and beam search is its de facto decoding method thanks to a reduced search space and lower computational complexity.  ...  Experiments show that such an approach can significantly improve translation accuracy on several translation tasks.  ...  We would like to thank the anonymous reviewers for their valuable comments on our paper.  ... 
dblp:conf/nips/HeLXQ0L17 fatcat:vyhspx77effgzpfzqp5p3wueo4
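The idea of steering decoding with a learned value function can be sketched very roughly. In the sketch below, `value_fn` is a stand-in for the trained value network and the linear score combination is an illustrative assumption, not the paper's exact formulation.

```python
def rescore_candidates(candidates, log_prob_fn, value_fn, alpha=0.5):
    """Rank beam candidates by a mix of the translation model's conditional
    log-probability and a value estimate of long-term translation quality."""
    scored = [(alpha * log_prob_fn(y) + (1.0 - alpha) * value_fn(y), y)
              for y in candidates]
    scored.sort(key=lambda s: -s[0])
    return [y for _, y in scored]

# Toy stand-ins: the shorter hypothesis has a higher log-probability, but
# the value function prefers the longer (more adequate) one.
log_probs = {"a cat": -1.0, "a black cat": -2.0}
values = {"a cat": 0.2, "a black cat": 0.9}
ranking = rescore_candidates(["a cat", "a black cat"],
                             log_probs.get, values.get, alpha=0.2)
print(ranking[0])  # → a black cat
```

Pure likelihood ranking (`alpha=1.0`) would instead return the shorter hypothesis, which is the myopia the value network is meant to correct.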

Coverage Embedding Models for Neural Machine Translation

Haitao Mi, Baskaran Sankaran, Zhiguo Wang, Abe Ittycheriah
2016 Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing  
For each source word, our model starts with a full coverage embedding vector to track the coverage status, and then keeps updating it with neural networks as the translation proceeds.  ...  In this paper, we enhance attention-based neural machine translation (NMT) by adding explicit coverage embedding models to alleviate the issues of repeated and dropped translations in NMT.  ...  Acknowledgment We thank the reviewers for their useful comments.  ... 
doi:10.18653/v1/d16-1096 dblp:conf/emnlp/MiSWI16 fatcat:5z2it3v2trgfjafwlb256hqtly

Morphology-aware Word-Segmentation in Dialectal Arabic Adaptation of Neural Machine Translation

Ahmed Tawfik, Mahitab Emam, Khaled Essam, Robert Nabil, Hany Hassan
2019 Proceedings of the Fourth Arabic Natural Language Processing Workshop  
Parallel corpora available for building machine translation (MT) models for dialectal Arabic (DA) are rather limited.  ...  The scarcity of resources has prompted the use of the abundant resources of Modern Standard Arabic (MSA) to complement the limited dialectal resources. However, clitics often differ between MSA and DA.  ...  Section 3 reviews the neural machine translation approach that we use to train and adapt translation models for dialectal Arabic.  ... 
doi:10.18653/v1/w19-4602 dblp:conf/wanlp/TawfikEENH19 fatcat:hrnonaofrjdyfnumtndaejywte