5 Hits in 1.2 sec

InferLite: Simple Universal Sentence Representations from Natural Language Inference Data

Jamie Kiros, William Chan
2018 Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Natural language inference has been shown to be an effective supervised task for learning generic sentence embeddings. ... Our models can be trained in under 1 hour on a single GPU and allow for fast inference of new representations. ... Introduction: Distributed representations of words have become immensely successful as the building blocks for deep neural networks applied to a wide range of natural language processing tasks (Pennington ...
doi:10.18653/v1/d18-1524 fatcat:ap6due35jzgynhh7tsmckr3wiq
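
The snippet above describes NLI as a supervision signal for generic sentence embeddings. As a minimal sketch of that objective (an assumption: this follows the standard InferSent-style recipe rather than InferLite's exact architecture; the toy bag-of-words encoder and all sizes are illustrative):

    # Sketch of the NLI sentence-embedding objective: encode premise and
    # hypothesis, combine the vectors, classify into 3 NLI labels.
    # Assumption: not InferLite's architecture; encoder/sizes are toys.
    import torch
    import torch.nn as nn

    class BagOfWordsEncoder(nn.Module):
        """Toy sentence encoder: mean-pooled word embeddings."""
        def __init__(self, vocab_size=10000, dim=300):
            super().__init__()
            self.emb = nn.Embedding(vocab_size, dim)

        def forward(self, token_ids):               # (batch, seq_len)
            return self.emb(token_ids).mean(dim=1)  # (batch, dim)

    class NLIClassifier(nn.Module):
        """Classify premise/hypothesis pairs from combined sentence vectors."""
        def __init__(self, encoder, dim=300, n_labels=3):
            super().__init__()
            self.encoder = encoder
            self.head = nn.Linear(4 * dim, n_labels)

        def forward(self, premise, hypothesis):
            u, v = self.encoder(premise), self.encoder(hypothesis)
            # Standard feature combination: [u; v; |u - v|; u * v].
            feats = torch.cat([u, v, (u - v).abs(), u * v], dim=-1)
            return self.head(feats)

    model = NLIClassifier(BagOfWordsEncoder())
    premise = torch.randint(0, 10000, (8, 12))
    hypothesis = torch.randint(0, 10000, (8, 10))
    logits = model(premise, hypothesis)  # (8, 3): entail/neutral/contradict
    loss = nn.functional.cross_entropy(logits, torch.randint(0, 3, (8,)))
    loss.backward()  # after training, encoder(sentence) is the embedding

Once trained on NLI, the classifier head is discarded and the encoder alone produces the universal sentence vectors.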

ACTRCE: Augmenting Experience via Teacher's Advice For Multi-Goal Reinforcement Learning [article]

Harris Chan, Yuhuai Wu, Jamie Kiros, Sanja Fidler, Jimmy Ba
2019 arXiv pre-print
We present Augmenting experienCe via TeacheR's adviCE (ACTRCE), an efficient reinforcement learning technique that extends the HER framework using natural language as the goal representation. ... Despite its effectiveness, HER has limited applicability because it lacks a compact and universal goal representation. ... InferLite: Simple universal sentence representations from natural language inference data. In EMNLP, 2018. Kuhlmann, G., Stone, P., Mooney, R., and Shavlik, J. ...
arXiv:1902.04546v1 fatcat:et4af4kvyrhlvarncijt4ro7x4
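
The mechanism being extended here is hindsight experience replay (HER): failed episodes are relabeled with a goal the agent did achieve, turning failures into useful training signal. A minimal sketch of that relabeling with language goals (the helper names, such as describe_achieved_state, are hypothetical stand-ins for the paper's "teacher", not its API):

    # Sketch of hindsight goal relabeling with natural-language goals,
    # in the spirit of HER/ACTRCE. All names here are hypothetical.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Transition:
        state: object
        action: object
        goal: str        # natural-language goal description
        reward: float

    def describe_achieved_state(state) -> str:
        """Hypothetical 'teacher': describes the reached state in language."""
        return f"reach {state}"

    def relabel_with_hindsight(trajectory: List[Transition]) -> List[Transition]:
        """Swap the original (failed) goal for a description of what the
        agent actually achieved, marking the final step as a success."""
        achieved = describe_achieved_state(trajectory[-1].state)
        return [
            Transition(t.state, t.action, achieved,
                       1.0 if i == len(trajectory) - 1 else 0.0)
            for i, t in enumerate(trajectory)
        ]

    # Usage: a failed episode ("reach B") becomes a successful one ("reach A").
    episode = [Transition("A", "left", "reach B", 0.0)]
    print(relabel_with_hindsight(episode)[0].goal)  # "reach A"

Using language as the goal space is what makes the relabeled goal compact and reusable across tasks, which is the limitation of vanilla HER the abstract points to.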

Contextual Lensing of Universal Sentence Representations [article]

Jamie Kiros
2020 arXiv pre-print
We break the construction of universal sentence vectors into a core, variable-length sentence matrix representation equipped with an adaptable 'lens' from which fixed-length vectors can be induced as ... We show that it is possible to focus notions of language similarity into a small number of lens parameters given a core universal matrix representation. ... Experimental Contributions: We lens BERT representations to the task of Natural Language Inference (NLI) for downstream English tasks. ...
arXiv:2002.08866v1 fatcat:7p3kk2w3ajhvbdtst3kac2fv7y
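
One way to read the 'lens' idea is as a small trainable pooling head over a frozen, variable-length sentence matrix: only the lens parameters adapt to a given notion of similarity. A minimal sketch under that assumption (the attention-style parameterization below is illustrative, not the paper's exact formulation):

    # Sketch of a 'lens': a few parameters that pool a variable-length
    # sentence matrix (e.g., BERT token states) into one fixed vector.
    # Assumption: attention pooling is one possible lens, not the paper's.
    import torch
    import torch.nn as nn

    class AttentionLens(nn.Module):
        def __init__(self, dim=768):
            super().__init__()
            self.query = nn.Parameter(torch.randn(dim))  # the only lens weights

        def forward(self, sent_matrix):        # (seq_len, dim), frozen core
            scores = sent_matrix @ self.query  # (seq_len,)
            weights = torch.softmax(scores, dim=0)
            return weights @ sent_matrix       # (dim,) fixed-length vector

    lens = AttentionLens()
    tokens = torch.randn(17, 768)  # stand-in for a BERT sentence matrix
    vec = lens(tokens)
    print(vec.shape)               # torch.Size([768])

The appeal of this split is that the expensive core representation is computed once, while retargeting to a new similarity notion only means retraining the small lens.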

Context Mover's Distance Barycenters: Optimal Transport of Contexts for Building Representations [article]

Sidak Pal Singh, Andreas Hug, Aymeric Dieuleveut, Martin Jaggi
2020 arXiv pre-print
This enables us to consider representation learning from the perspective of Optimal Transport and take advantage of its tools such as Wasserstein distance and barycenters. ... We elaborate how the method can be applied for obtaining unsupervised representations of text and illustrate the performance (quantitatively as well as qualitatively) on tasks such as measuring sentence ... InferLite: Simple universal sentence representations from natural language inference data. ...
arXiv:1808.09663v6 fatcat:sojmvsoyo5d3rgcetfrloae6dm
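
To make the optimal-transport framing concrete: a word (or sentence) is treated as an empirical distribution over its contexts, and two such distributions are compared with Wasserstein distance. A minimal sketch using SciPy's one-dimensional implementation (a simplification of the paper's setting; the random data and the 1-D projection are assumptions for illustration only):

    # Sketch of comparing two context distributions with Wasserstein
    # distance. scipy.stats.wasserstein_distance is 1-D, so contexts are
    # projected onto a shared direction first (a simplification).
    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(0)

    # Hypothetical context embeddings (n_contexts, dim) for two words.
    contexts_a = rng.normal(loc=0.0, size=(50, 300))
    contexts_b = rng.normal(loc=0.5, size=(50, 300))

    # Project onto a random unit direction -> 1-D empirical distributions.
    direction = rng.normal(size=300)
    direction /= np.linalg.norm(direction)

    dist = wasserstein_distance(contexts_a @ direction, contexts_b @ direction)
    print(f"1-D Wasserstein distance between context distributions: {dist:.3f}")

The barycenter mentioned in the abstract plays the same role for aggregation: a sentence's representation can be built as the Wasserstein barycenter of its words' context distributions.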

Learning Compressed Sentence Representations for On-Device Text Processing

Dinghan Shen, Pengyu Cheng, Dhanasekar Sundararaman, Xinyuan Zhang, Qian Yang, Meng Tang, Asli Celikyilmaz, Lawrence Carin
2019 Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics  
Vector representations of sentences, trained on massive text corpora, are widely used as generic sentence embeddings across a variety of NLP problems. ... Moreover, with the learned binary representations, the semantic relatedness of two sentences can be evaluated by simply calculating their Hamming distance, which is more computationally efficient compared ... inference tasks, i.e., the Stanford Natural Language Inference (SNLI) (Bowman et al., 2015) and Multi-Genre Natural Language Inference (MultiNLI) (Williams et al., 2017) datasets. ...
doi:10.18653/v1/p19-1011 dblp:conf/acl/ShenCSZYTCC19 fatcat:34mqfel4snfrtfqpsf4qi43e7e
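
The Hamming-distance payoff described in the abstract is easy to demonstrate: once sentence embeddings are binarized, relatedness reduces to a bitwise XOR and a popcount, with no floating-point arithmetic. A minimal sketch (median thresholding below is a simple stand-in for the paper's learned binarization, not its method):

    # Sketch: binarize continuous sentence embeddings, then compare
    # sentences by Hamming distance instead of cosine similarity.
    # Assumption: median thresholding stands in for the learned codes.
    import numpy as np

    def binarize(embeddings: np.ndarray) -> np.ndarray:
        """Threshold each dimension at its median -> {0, 1} codes."""
        return (embeddings > np.median(embeddings, axis=0)).astype(np.uint8)

    def hamming(a: np.ndarray, b: np.ndarray) -> int:
        """Number of differing bits: XOR, then count nonzeros."""
        return int(np.count_nonzero(a ^ b))

    rng = np.random.default_rng(0)
    emb = rng.normal(size=(3, 512))  # 3 sentence vectors, 512-D
    codes = binarize(emb)
    print(hamming(codes[0], codes[1]), hamming(codes[0], codes[2]))

On-device, the binary codes also shrink storage by roughly 32x relative to float32 vectors of the same dimensionality, which is the compression angle in the paper's title.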