
Multimodal Machine Translation

Jiatong Liu
2021 IEEE Access  
The results show that the model can significantly improve the quality of multimodal neural network machine translation, which also verifies the importance of integrating external knowledge and visual text  ...  Besides, the decoder decodes and generates a translation based on the image and text representation of the source.  ... 
doi:10.1109/access.2021.3115135 fatcat:d2anaeg3qnarpfmlc4eap2urrm
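The snippet above describes a decoder that generates the translation from both an image representation and a text representation of the source. As a rough illustration of that kind of fusion only (not the paper's actual architecture; the module names, dimensions, and the pooled-image-feature assumption are ours), a PyTorch decoder might look like this:

```python
# Illustrative sketch only: a target-side GRU decoder conditioned on
# (a) the final state of a source-text encoder and (b) a pooled CNN image feature.
import torch
import torch.nn as nn

class MultimodalDecoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512, img_dim=2048):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.img_proj = nn.Linear(img_dim, hid_dim)        # project the image feature
        self.rnn = nn.GRU(emb_dim + hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, prev_tokens, text_state, img_feat):
        # prev_tokens: (batch, tgt_len) previously generated target tokens
        # text_state:  (1, batch, hid_dim) final hidden state of a source-text encoder
        # img_feat:    (batch, img_dim) pooled CNN feature of the image
        emb = self.embed(prev_tokens)                      # (batch, tgt_len, emb_dim)
        img = self.img_proj(img_feat).unsqueeze(1)         # (batch, 1, hid_dim)
        img = img.expand(-1, emb.size(1), -1)              # repeat at every time step
        dec_out, _ = self.rnn(torch.cat([emb, img], dim=-1), text_state)
        return self.out(dec_out)                           # logits over the target vocab
```

A complete system would pair this decoder with a source-sentence encoder and an image CNN, trained on image-caption translation data such as Multi30k.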

Visually Grounded Word Embeddings and Richer Visual Features for Improving Multimodal Neural Machine Translation

Jean-Benoit Delbrouck, Stéphane Dupont, Omar Seddati
2017 arXiv   pre-print
In Multimodal Neural Machine Translation (MNMT), a neural model generates a translated sentence that describes an image, given the image itself and one source description in English.  ...  We hypothesize that richer architectures, such as dense captioning models, may be more suitable for MNMT and could lead to improved translations.  ...  Introduction: In machine translation, neural networks have attracted a lot of research attention. Recently, the encoder-decoder framework [1] has been largely adopted.  ... 
arXiv:1707.01009v4 fatcat:kvj7idsk2bbexoyldtvwrezhha

Short Sequence Chinese-English Machine Translation Based on Generative Adversarial Networks of Emotion

Hua Wang, Mohamed Abdelaziz
2022 Computational Intelligence and Neuroscience  
Improving the accuracy of neural machine translation through deep learning is the core problem that researchers study.  ...  In this paper, a neural machine translation model based on a generative adversarial network is studied to make the translations produced by the neural network more accurate and more expressive.  ...  Beyond the Transformer model, there is still considerable room for improvement in neural machine translation models.  ... 
doi:10.1155/2022/3385477 pmid:35685136 pmcid:PMC9173932 fatcat:mfmd6t675vbwhbzjxbhgeurmxm
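Since the snippet only names the idea, here is a hedged sketch of how a GAN-style update for NMT is commonly organized: a seq2seq generator samples translations, a discriminator scores whether a (source, translation) pair looks human-produced, and the generator is rewarded with a REINFORCE-style term because sampled tokens are discrete. `generator.sample` and the discriminator call are assumed interfaces for this sketch, not APIs from the paper.

```python
# Hedged sketch of one adversarial update, not the paper's exact training scheme.
import torch
import torch.nn.functional as F

def adversarial_step(generator, discriminator, g_opt, d_opt, src, ref):
    # --- discriminator update: human pairs -> 1, machine pairs -> 0 ---
    with torch.no_grad():
        fake, _ = generator.sample(src)          # hypothetical sampling API
    d_real = discriminator(src, ref)             # (batch,) probabilities
    d_fake = discriminator(src, fake)
    d_loss = F.binary_cross_entropy(d_real, torch.ones_like(d_real)) + \
             F.binary_cross_entropy(d_fake, torch.zeros_like(d_fake))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # --- generator update: make sampled translations fool the discriminator ---
    sample, log_probs = generator.sample(src)    # log_probs: (batch,) sum over tokens
    reward = discriminator(src, sample).detach() # higher = more "human-like"
    g_loss = -(reward * log_probs).mean()        # policy-gradient surrogate loss
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
    return d_loss.item(), g_loss.item()
```

In practice the generator is usually pre-trained with ordinary maximum likelihood before adversarial fine-tuning, otherwise the discriminator wins immediately.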

Encouraging an Appropriate Representation Simplifies Training of Neural Networks [article]

Krisztian Buza
2019 arXiv   pre-print
internal representation may improve the generalization ability of neural networks.  ...  A common assumption about neural networks is that they can learn an appropriate internal representation on their own, see e.g. end-to-end learning. In this work we challenge this assumption.  ...  Buza was supported by Thematic Excellence Programme, Industry and Digitization Subprogramme, NRDI Office, 2019 and received the "Profes-  ... 
arXiv:1911.07245v1 fatcat:wabck324m5b7toq6e3mnnm2pbq
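As a hedged reading of the abstract (the exact formulation is not visible in this snippet), the general recipe of "encouraging" a representation can be written as a main task loss plus an auxiliary penalty that pulls a chosen hidden layer toward a predefined target representation; the class names, the MSE penalty, and the weighting below are illustrative assumptions, not the paper's definition.

```python
# Sketch of the general idea: an auxiliary loss term that pushes a hidden layer
# toward a predefined "appropriate" representation, instead of relying purely on
# end-to-end learning.
import torch
import torch.nn as nn

class TwoHeadNet(nn.Module):
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.hidden = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU())
        self.classifier = nn.Linear(hid_dim, n_classes)

    def forward(self, x):
        h = self.hidden(x)                 # the internal representation of interest
        return self.classifier(h), h

def loss_fn(logits, h, y, target_repr, alpha=0.1):
    # main task loss + penalty for deviating from the desired representation
    task = nn.functional.cross_entropy(logits, y)
    repr_penalty = nn.functional.mse_loss(h, target_repr)
    return task + alpha * repr_penalty
```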

Neural Machine Translation for Cross-Lingual Pronoun Prediction

Sébastien Jean, Stanislas Lauly, Orhan Firat, Kyunghyun Cho
2017 Proceedings of the Third Workshop on Discourse in Machine Translation  
For all four language pairs, we trained a standard attention-based neural machine translation system as well as three variants that incorporate information from the preceding source sentence.  ...  We show that our systems, which are not specifically designed for pronoun prediction and may be used to generate complete sentence translations, generally achieve competitive results on this task.  ...  Acknowledgments: This work was supported by Samsung Electronics ("Larger-Context Neural Machine Translation" and "Next Generation Deep Learning: from pattern recognition to AI").  ... 
doi:10.18653/v1/w17-4806 dblp:conf/discomt/JeanLFC17 fatcat:vr4aci5vr5hcvmo54witfr3kxa
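The simplest way to incorporate the preceding source sentence, as described above, can be illustrated by concatenating it in front of the current sentence before encoding; the separator token and helper below are our own illustrative assumptions, not necessarily one of the paper's three variants.

```python
# Illustrative sketch: expose the preceding source sentence to an NMT encoder by
# concatenating it in front of the current sentence with a separator token.
# The token name "<ctx>" and the truncation policy are assumptions for this example.
SEP = "<ctx>"

def build_context_input(prev_src_tokens, cur_src_tokens, max_len=200):
    """Return one token sequence carrying the previous and the current source sentence."""
    joined = prev_src_tokens + [SEP] + cur_src_tokens
    # keep the tail so the current sentence is never the part that gets truncated
    return joined[-max_len:]

prev = "the manager praised her team".split()
cur = "she said it exceeded expectations".split()
print(build_context_input(prev, cur))
# previous sentence, then '<ctx>', then the current sentence, as one token list
```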

Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation [article]

Kyunghyun Cho, Bart van Merrienboer, Caglar Gulcehre, Dzmitry Bahdanau, Fethi Bougares, Holger Schwenk, Yoshua Bengio
2014 arXiv   pre-print
The performance of a statistical machine translation system is empirically found to improve by using the conditional probabilities of phrase pairs computed by the RNN Encoder-Decoder as an additional feature  ...  One RNN encodes a sequence of symbols into a fixed-length vector representation, and the other decodes the representation into another sequence of symbols.  ...  Word and Phrase Representations: Since the proposed RNN Encoder-Decoder is not specifically designed only for the task of machine translation, here we briefly look at the properties of  ... 
arXiv:1406.1078v3 fatcat:5gl2ci3wbnagzgbe5mtlqh6guu
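The abstract above states the architecture directly (one RNN compresses the source into a fixed-length vector, a second RNN generates the target from it), so a compact sketch is easy to give; hyperparameters and the use of GRUs on both sides are illustrative choices, and the paper's use of the model's phrase-pair scores as an additional SMT feature is not shown here.

```python
# Minimal GRU encoder-decoder sketch in PyTorch, following the idea in the abstract.
import torch
import torch.nn as nn

class EncoderDecoder(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb_dim=256, hid_dim=512):
        super().__init__()
        self.src_embed = nn.Embedding(src_vocab, emb_dim)
        self.tgt_embed = nn.Embedding(tgt_vocab, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, src, tgt_in):
        _, summary = self.encoder(self.src_embed(src))   # fixed-length source summary
        dec_out, _ = self.decoder(self.tgt_embed(tgt_in), summary)
        return self.out(dec_out)                         # (batch, tgt_len, tgt_vocab)

model = EncoderDecoder(src_vocab=8000, tgt_vocab=8000)
src = torch.randint(0, 8000, (2, 7))      # batch of 2 source sentences, length 7
tgt_in = torch.randint(0, 8000, (2, 5))   # shifted target tokens (teacher forcing)
print(model(src, tgt_in).shape)           # torch.Size([2, 5, 8000])
```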

A Comparative Study of Text Genres in English-Chinese Translation Effects Based on Deep Learning LSTM

Xiaoda Zhao, Xiaoyan Jin, Naeem Jan
2022 Computational and Mathematical Methods in Medicine  
Simultaneously, grammar knowledge is essential for translation, as it aids in the grammatical representation of word sequences and reduces grammatical errors.  ...  In recent years, neural network-based English-Chinese translation models have gradually supplanted traditional translation methods.  ...  graph representation by potential graph parsing, and using source and target-side dependency tree to improve neural machine translation.  ... 
doi:10.1155/2022/7068406 pmid:35693269 pmcid:PMC9184169 fatcat:wy7tzar6end7hndajf25octb2q

Encouraging an appropriate representation simplifies training of neural networks

Krisztian Buza
2020 Acta Universitatis Sapientiae: Informatica  
internal representation may improve the generalization ability of neural networks.  ...  A common assumption about neural networks is that they can learn an appropriate internal representation on their own, see e.g. end-to-end learning. In this work we challenge this assumption.  ...  ED 18-1-2019-0030 (Application domain specific highly reliable IT solutions subprogramme) has been implemented with the support provided from the National Research, Development and Innovation Fund of Hungary  ... 
doi:10.2478/ausi-2020-0007 fatcat:tqlejerk5zg2hajn6hgj6dquji

Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation

Kyunghyun Cho, Bart van Merrienboer, Caglar Gulcehre, Dzmitry Bahdanau, Fethi Bougares, Holger Schwenk, Yoshua Bengio
2014 Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP)  
The performance of a statistical machine translation system is empirically found to improve by using the conditional probabilities of phrase pairs computed by the RNN Encoder-Decoder as an additional feature  ...  One RNN encodes a sequence of symbols into a fixed-length vector representation, and the other decodes the representation into another sequence of symbols.  ...  Word and Phrase Representations: Since the proposed RNN Encoder-Decoder is not specifically designed only for the task of machine translation, here we briefly look at the properties of  ... 
doi:10.3115/v1/d14-1179 dblp:conf/emnlp/ChoMGBBSB14 fatcat:uiy743kyojcknh7pjgs4x33osa

Encoders Help You Disambiguate Word Senses in Neural Machine Translation [article]

Gongbo Tang and Rico Sennrich and Joakim Nivre
2019 arXiv   pre-print
Neural machine translation (NMT) has achieved new state-of-the-art performance in translating ambiguous words. However, it is still unclear which component dominates the process of disambiguation.  ...  We train a classifier to predict whether a translation is correct given the representation of an ambiguous noun.  ...  Neural machine translation by jointly learning to align and translate. In International Conference on Learning Representations, San Diego, California, USA.  ... 
arXiv:1908.11771v1 fatcat:madqd2eqbnfinabvhs2etkuvfy
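The probing setup described above (a classifier predicts, from the hidden vector of an ambiguous noun, whether the noun was translated correctly) can be sketched with a plain logistic-regression probe; the random arrays below merely stand in for states exported from a real NMT encoder.

```python
# Illustrative probing sketch: logistic regression on per-token encoder states.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
hidden_states = rng.normal(size=(1000, 512))   # encoder states of ambiguous nouns (placeholder)
correct = rng.integers(0, 2, size=1000)        # 1 = the noun was translated correctly

X_train, X_test, y_train, y_test = train_test_split(
    hidden_states, correct, test_size=0.2, random_state=0)
probe = LogisticRegression(max_iter=1000).fit(X_train, y_train)
# If real encoder states carry sense information, accuracy should beat chance.
print("probe accuracy:", probe.score(X_test, y_test))
```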

Research on Neural Machine Translation Model

Mengyao Chen, Yong Li, Runqi Li
2019 Journal of Physics, Conference Series  
In neural machine translation (NMT), recurrent neural networks, especially long short-term memory networks and gated recurrent neural networks, have long been regarded as the leading methods for sequence modeling and transduction problems such as language modeling and machine translation.  ...  Acknowledgments: We express our sincere gratitude to our teacher Li Yong for his help in the process of writing the thesis.  ... 
doi:10.1088/1742-6596/1237/5/052020 fatcat:nghf3oryznatboysa2t4xswlmu

Toward English-Chinese Translation Based on Neural Networks

Fan Xiao, Hasan Ali Khattak
2022 Mobile Information Systems  
Toward this solution, a neural network (NN) based translation approach is proposed to predict word order differences in language translation and improve translation accuracy in long sentences.  ...  Experimental results show that the NN preorder model can significantly improve translation accuracy and system performance.  ...  This is similar to human translation work, that is, first understand the meaning of the source language and then organize the language for translation.  ... 
doi:10.1155/2022/3114123 fatcat:p7bpxsgd2fc4hfe2yfm7rlffha

What do Neural Machine Translation Models Learn about Morphology?

Yonatan Belinkov, Nadir Durrani, Fahim Dalvi, Hassan Sajjad, James Glass
2017 Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)  
Neural machine translation (MT) models obtain state-of-the-art performance while maintaining a simple, end-to-end architecture.  ...  In this work, we analyze the representations learned by neural MT models at various levels of granularity and empirically evaluate the quality of the representations for learning morphology through extrinsic  ...  Acknowledgments: We would like to thank Helmut Schmid for providing the Tiger corpus, members of the MIT Spoken Language Systems group for helpful comments, and the three anonymous reviewers for their useful  ... 
doi:10.18653/v1/p17-1080 dblp:conf/acl/BelinkovDDSG17 fatcat:55clgjs6vnff3irc77aqtomhwu
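The extrinsic evaluation described above can be sketched in the same spirit: freeze the MT model, export token representations from different layers, and train a small tagger on each to compare how much morphology each layer encodes. Layer names, dimensions, and the synthetic features below are placeholders, not the paper's data.

```python
# Illustrative layer-wise probe: train one classifier per layer's representations
# and compare held-out tagging accuracy. Random features stand in for real states.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_tokens, dim, n_tags = 2000, 512, 12
tags = rng.integers(0, n_tags, size=n_tokens)               # gold morphological tags
layer_states = {f"layer{l}": rng.normal(size=(n_tokens, dim)) for l in (0, 1, 2)}

for name, feats in layer_states.items():
    probe = LogisticRegression(max_iter=1000)
    probe.fit(feats[:1500], tags[:1500])                     # train split
    acc = probe.score(feats[1500:], tags[1500:])             # held-out split
    print(f"{name}: morphology probe accuracy = {acc:.3f}")
```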

Research on Machine Translation of Deep Neural Network Learning Model Based on Ontology

Yaya Tian, Shaweta Khanna, Anton Pljonkin
2021 Informatica (Ljubljana, Tiskana izd.)  
The machine translation method based on deep neural network learning can significantly improve the quality and efficiency of translation.  ...  Based on the characteristics of Ontology's domain knowledge concept system, a machine translation method built on a deep neural network learning model is proposed.  ...  Acknowledgement: Research Center of Shangluo Culture and Jia Pingwa, A Study on the Translation and Introduction of Mo Yan's Works and Jia Pingwa's Works in English-speaking Countries (18SLWH05).  ... 
doi:10.31449/inf.v45i5.3559 fatcat:4rrkga4l2jeerdy7nens6pt7he

Hard but Robust, Easy but Sensitive: How Encoder and Decoder Perform in Neural Machine Translation [article]

Tianyu He, Xu Tan, Tao Qin
2019 arXiv   pre-print
Neural machine translation (NMT) typically adopts the encoder-decoder framework.  ...  A good understanding of the characteristics and functionalities of the encoder and decoder can help to explain the pros and cons of the framework, and design better models for NMT.  ...  Yanzhuo Ding, Yang Liu, Huanbo Luan, and Maosong Sun. 2017. Visualizing and understanding neural machine translation.  ... 
arXiv:1908.06259v1 fatcat:6bvoj2conff5dft2gckrab2xrm
Showing results 1 — 15 out of 33,660 results