Bringing Order to Neural Word Embeddings with Embeddings Augmented by Random Permutations (EARP)

Trevor Cohen, Dominic Widdows
Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL), 2018
Word order is clearly a vital part of human language, but it has played a comparatively small role in distributional vector models. This paper presents a new method for incorporating word order information into word vector embedding models by combining the benefits of permutation-based order encoding with the more recent method of skip-gram with negative sampling. The new method, called Embeddings Augmented by Random Permutations (EARP), operates by applying permutations to the coordinates of context vector representations during training. Results show an 8% improvement in accuracy on the challenging Bigger Analogy Test Set, and smaller but consistent improvements on other analogy reference sets. These findings demonstrate the importance of order-based information in analogical retrieval tasks, and the utility of random permutations as a means to augment neural embeddings.
doi:10.18653/v1/k18-1045 dblp:conf/conll/CohenW18
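As a rough illustration of the mechanism the abstract describes, the sketch below shows an SGNS-style update in which each relative context position is assigned a fixed random permutation that is applied to the context vector's coordinates before the dot product, so that the same word contributes differently depending on where it appears. This is a minimal NumPy sketch under assumed hyperparameters; the names (`train_pair`, `W_target`, `W_context`, `WINDOW`) and the exact update are hypothetical and are not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
V, D, WINDOW = 10_000, 100, 2            # assumed vocabulary size, dimensionality, context radius

W_target = rng.normal(0, 0.1, (V, D))    # target (input) embeddings
W_context = rng.normal(0, 0.1, (V, D))   # context (output) embeddings

# One fixed random permutation per relative position -WINDOW..+WINDOW (0 excluded):
# encoding word order by permuting coordinates, in the spirit of EARP.
perms = {p: rng.permutation(D) for p in range(-WINDOW, WINDOW + 1) if p != 0}

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_pair(target, context, position, label, lr=0.025):
    """One SGNS step where the context vector is permuted by relative position.

    label is 1 for an observed (target, context) pair and 0 for a negative sample.
    """
    perm = perms[position]
    c = W_context[context][perm]         # context coordinates permuted by position
    t = W_target[target]
    g = (sigmoid(t @ c) - label) * lr    # gradient of the logistic loss w.r.t. the score
    W_target[target] -= g * c
    inv = np.argsort(perm)               # inverse permutation, to write the gradient back
    W_context[context] -= (g * t)[inv]
```

In a full training loop, `train_pair` would be called with `label=1` for each observed (target, context, position) triple and `label=0` for a handful of negative samples drawn from a noise distribution, as in standard skip-gram with negative sampling.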