Non-Adversarial Unsupervised Word Translation
[article]
2018
arXiv
pre-print
Unsupervised word translation from non-parallel inter-lingual corpora has attracted much research interest. ...
Extensive experiments on word translation of European and Non-European languages show that our method achieves better performance than recent state-of-the-art deep adversarial approaches and is competitive ...
Conclusions: We have presented an effective technique for unsupervised word-to-word translation. Our method is simple and non-adversarial. ...
arXiv:1801.06126v3
fatcat:rtydn7eq2vb6dagahfmht6abrq
Non-Adversarial Unsupervised Word Translation
2018
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Unsupervised word translation from non-parallel inter-lingual corpora has attracted much research interest. ...
Extensive experiments on word translation of European and Non-European languages show that our method achieves better performance than recent state-of-the-art deep adversarial approaches and is competitive ...
Conclusions: We have presented an effective technique for unsupervised word-to-word translation. Our method is simple and non-adversarial. ...
doi:10.18653/v1/d18-1043
dblp:conf/emnlp/HoshenW18
fatcat:26zzwfauyfflnbrndveqvateha
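The non-adversarial methods in the entries above typically refine an orthogonal mapping between the two embedding spaces by self-learning. A minimal, hypothetical sketch of such a refinement loop on toy data (function name and matching heuristic are illustrative assumptions; the actual papers add careful initialization and retrieval criteria):

```python
import numpy as np

def refine_mapping(X, Y, n_iters=5):
    """ICP-style self-learning: alternate (i) nearest-neighbor matching
    of mapped source vectors against target vectors and (ii) re-fitting
    the closest orthogonal map on the induced pairs via SVD (Procrustes)."""
    W = np.eye(X.shape[1])
    for _ in range(n_iters):
        matches = np.argmax((X @ W) @ Y.T, axis=1)   # match step
        U, _, Vt = np.linalg.svd(X.T @ Y[matches])   # re-fit on induced pairs
        W = U @ Vt                                   # closest orthogonal map
    return W

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))                       # toy "source" embeddings
Y = X @ np.linalg.qr(rng.normal(size=(20, 20)))[0]   # rotated "target" copy
W = refine_mapping(X, Y)                             # W stays orthogonal
```

The SVD step guarantees W remains orthogonal at every iteration, which is what makes the loop stable without any adversarial training signal.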
Learning Unsupervised Word Translations Without Adversaries
2018
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Recent research has shown that word translation can be achieved in an unsupervised manner, without parallel seed dictionaries or aligned corpora. ...
Our method performs comparably to adversarial alternatives and outperforms prior non-adversarial methods. ...
Our method: (i) achieves similar unsupervised translation performance to recent adversarial methods, while being significantly easier to train and (ii) clearly outperforms prior non-adversarial methods ...
doi:10.18653/v1/d18-1063
dblp:conf/emnlp/MukherjeeYH18
fatcat:i36jnvfdqvakvc23tescx54cdi
Revisiting Adversarial Autoencoder for Unsupervised Word Translation with Cycle Consistency and Improved Training
[article]
2019
arXiv
pre-print
In this work, we revisit the adversarial autoencoder for unsupervised word translation and propose two novel extensions that yield more stable training and improved results. ...
Extensive experiments with European, non-European and low-resource languages show that our method is more robust and achieves better performance than recently proposed adversarial and non-adversarial ...
Figure 1: Our proposed adversarial autoencoder framework for unsupervised word translation. ...
arXiv:1904.04116v1
fatcat:zos7hdfpmjb63gvkoc7bbk355y
Revisiting Adversarial Autoencoder for Unsupervised Word Translation with Cycle Consistency and Improved Training
2019
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL)
In this work, we revisit the adversarial autoencoder for unsupervised word translation and propose two novel extensions that yield more stable training and improved results. ...
However, recent work has shown superior performance for non-adversarial methods in more challenging language pairs. ...
After discussing related work in Section 2, we present our unsupervised word translation approach with an adversarial autoencoder in Section 3. ...
doi:10.18653/v1/n19-1386
dblp:conf/naacl/MohiuddinJ19
fatcat:lf2rnl3aa5hjpdvfqjxr76xkoy
Unsupervised Cross-Lingual Representation Learning
2019
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics: Tutorial Abstracts
only dozens of word translation pairs) or no supervision at all (fully unsupervised models). Such resource-light unsupervised methods are based on the assumption that monolingual word vector spaces are ...
Part IV: Non-Adversarial Seed Induction. In the next part, we will present several non-adversarial alternatives for unsupervised seed induction based on convex relaxations, point set registration methods ...
Non-adversarial unsupervised word translation. In Proceedings of EMNLP, pages 469-478. ...
doi:10.18653/v1/p19-4007
dblp:conf/acl/RuderSV19
fatcat:khz7rqq3kzaojjssfdvkiqv3ma
Duality Regularization for Unsupervised Bilingual Lexicon Induction
[article]
2019
arXiv
pre-print
Unsupervised bilingual lexicon induction naturally exhibits duality, which results from symmetry in back-translation: for example, EN-IT and IT-EN induction can be viewed as mutually primal and dual problems. ...
In this paper, we propose to train the primal and dual models jointly, using regularizers to encourage consistency in back-translation cycles. ...
Table 3: Word translation examples for English-Italian on MUSE. Ground truths are marked in bold.
Yedid Hoshen and Lior Wolf. 2018. Non-adversarial unsupervised word translation. ...
arXiv:1909.01013v1
fatcat:nqndyhydtrbabg6uwejs3krqei
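The duality idea in this entry — the EN→IT and IT→EN mappings should invert each other — can be expressed as a back-translation regularizer. A minimal numpy sketch, assuming linear mappings and hypothetical names `W_st` (source→target) and `W_ts` (target→source):

```python
import numpy as np

def cycle_loss(X, W_st, W_ts):
    """Mean squared error between each source vector and its
    back-translation X @ W_st @ W_ts (source -> target -> source)."""
    round_trip = X @ W_st @ W_ts
    return float(np.mean((round_trip - X) ** 2))

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 50))          # toy source embeddings
W_st = rng.normal(size=(50, 50)) * 0.1   # untrained source -> target map
W_ts = rng.normal(size=(50, 50)) * 0.1   # untrained target -> source map
loss = cycle_loss(X, W_st, W_ts)         # positive unless the maps invert each other
```

Adding such a term to both translation directions' objectives is one way to encourage the consistency the abstract describes; the loss vanishes exactly when `W_ts` is a left inverse of `W_st` on the data.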
Unsupervised Word Translation Pairing using Refinement based Point Set Registration
[article]
2020
arXiv
pre-print
Current unsupervised approaches rely on similarities in geometric structure of word embedding spaces across languages, to learn structure-preserving linear transformations using adversarial networks and ...
This paper proposes BioSpere, a novel framework for unsupervised mapping of bi-lingual word embeddings onto a shared vector space, by combining adversarial initialization and refinement procedure with ...
An unsupervised framework for bi-lingual word alignment was first proposed by Barone (2016) and Zhang et al. (2017a) using adversarial training. ...
arXiv:2011.13200v1
fatcat:unv63uhosngxdorgcn2ytzo7je
Unsupervised Medical Image Translation Using Cycle-MedGAN
[article]
2019
arXiv
pre-print
The proposed framework utilizes new non-adversarial cycle losses which direct the framework to minimize the textural and perceptual discrepancies in the translated images. ...
On the other hand, unsupervised translation frameworks often result in blurred translated images with unrealistic details. ...
Non-Adversarial Cycle Losses: Cycle-GAN relies on the cycle-consistency loss to avoid mismatches which could occur due to unsupervised training using unpaired images. ...
arXiv:1903.03374v1
fatcat:6eittos3tjhbrddjdi4optojqu
Point Set Registration for Unsupervised Bilingual Lexicon Induction
2018
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
By properly adapting a traditional point set registration model to make it suitable for processing word embeddings, we achieved state-of-the-art performance on the unsupervised bilingual lexicon induction ...
Inspired by the observation that word embeddings exhibit isomorphic structure across languages, we propose a novel method to induce a bilingual lexicon from only two sets of word embeddings, which are ...
A rigid transformation only allows translation, rotation, and scaling; affine and non-rigid transformations also allow anisotropic scaling and skew. ...
doi:10.24963/ijcai.2018/555
dblp:conf/ijcai/CaoZ18
fatcat:tbmyi53lmjf2npadpdwtqoedy4
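For known word pairs, the rigid-transformation view above reduces to the orthogonal Procrustes problem. The registration papers induce the pairs without supervision, but the fitting step itself can be sketched as follows (toy data, illustrative names):

```python
import numpy as np

def orthogonal_procrustes(X, Y):
    """Rotation R minimizing ||X @ R - Y||_F, via the SVD of the
    cross-covariance matrix X^T Y (Kabsch / Procrustes solution)."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))                       # source embeddings
R_true = np.linalg.qr(rng.normal(size=(50, 50)))[0]  # hidden rotation
Y = X @ R_true                                       # rotated targets

R = orthogonal_procrustes(X, Y)   # recovers R_true up to numerical precision
```

Because the solution is constrained to be orthogonal, it preserves the isomorphic structure of the embedding space that the abstract's argument relies on.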
Two Way Adversarial Unsupervised Word Translation
[article]
2019
arXiv
pre-print
Word translation is a problem in machine translation that seeks to build models that recover word level correspondence between languages. ...
Recent approaches to this problem have shown that word translation models can be learned with very small seed dictionaries, and even without any starting supervision. ...
Inference: Given a mapping W between the word embeddings of a source language and a target language, recovering the translation of a word is a non-trivial problem. ...
arXiv:1912.10168v1
fatcat:by4yp6rrczf2dhlkjibeugxdia
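The inference step this entry mentions — recovering a word from a mapped embedding — is commonly done by nearest-neighbor retrieval in the target space (refinements such as CSLS correct for hubness). A minimal cosine-similarity sketch with hypothetical toy data:

```python
import numpy as np

def translate(src_vec, W, tgt_emb, tgt_words, k=1):
    """Map a source embedding through W, then return the k target
    words whose embeddings have the highest cosine similarity."""
    q = src_vec @ W
    q = q / np.linalg.norm(q)
    T = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    top = np.argsort(-(T @ q))[:k]   # indices of the k most similar rows
    return [tgt_words[i] for i in top]

# Toy target vocabulary of three words with orthogonal embeddings.
tgt_emb = np.eye(3)
tgt_words = ["gatto", "cane", "pane"]
src_vec = np.array([0.9, 0.1, 0.0])  # closest to the first target word
best = translate(src_vec, np.eye(3), tgt_emb, tgt_words)
```

With an identity mapping the query stays closest to the first target row, so `best` is `["gatto"]`; in a real system `W` is the learned cross-lingual map and `tgt_emb` the full target vocabulary matrix.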
Unsupervised Cross-lingual Transfer of Word Embedding Spaces
[article]
2018
arXiv
pre-print
the back-translation losses. ...
supervised and unsupervised baseline methods over many language pairs. ...
In the unsupervised setting, the goal is to learn the mapping G and F without any paired word translation. ...
arXiv:1809.03633v1
fatcat:ppxcvk5nojdt5jbohu47cfosoq
Unsupervised Cross-lingual Transfer of Word Embedding Spaces
2018
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
supervised and unsupervised baseline methods over many language pairs. ...
This paper proposes an unsupervised learning approach that does not require any cross-lingual labeled data. ...
In the unsupervised setting, the goal is to learn the mapping G and F without any paired word translation. ...
doi:10.18653/v1/d18-1268
dblp:conf/emnlp/XuYOW18
fatcat:qvvwdowh4nhrdnujnvk2sld7jq
LNMap: Departures from Isomorphic Assumption in Bilingual Lexicon Induction Through Non-Linear Mapping in Latent Space
[article]
2020
arXiv
pre-print
Ablation studies show the importance of different model components and the necessity of non-linear mapping. ...
In this work, we propose a novel semi-supervised method to learn cross-lingual word embeddings for BLI. ...
This model is by far the most robust and best-performing unsupervised model.
(c) Mohiuddin and Joty (2019) use an adversarial autoencoder for unsupervised word translation. ...
arXiv:2004.13889v2
fatcat:qsjnkyu6lnfibk45vrjjpe2o2q
Unsupervised Bilingual Word Embedding Agreement for Unsupervised Neural Machine Translation
2019
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Unsupervised bilingual word embedding (UBWE), together with other technologies such as back-translation and denoising, has helped unsupervised neural machine translation (UNMT) achieve remarkable results ...
A distribution-based model to learn bilingual word embeddings. ...
Rui Wang was partially supported by JSPS grant-in-aid for early-career scientists (19K20354): "Unsupervised Neural Machine Translation in Universal Scenarios" and NICT tenure-track researcher startup fund ...
doi:10.18653/v1/p19-1119
dblp:conf/acl/SunWCUSZ19
fatcat:7bvqumilhne3xo5cefkqbkx5me