3 Hits in 2.0 sec

xMoCo: Cross Momentum Contrastive Learning for Open-Domain Question Answering

Nan Yang, Furu Wei, Binxing Jiao, Daxin Jiang, Linjun Yang
2021 Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)   unpublished
In this paper, we propose a new contrastive learning method called cross momentum contrastive learning (xMoCo), for learning a dual-encoder model for query-passage matching. ... Dense passage retrieval has been shown to be an effective approach for information retrieval tasks such as open-domain question answering. ... Conclusion In this paper, we propose cross momentum contrastive learning (xMoCo), for the passage retrieval task in open-domain QA. xMoCo jointly optimizes question-to-passage and passage-to-question matching ...
doi:10.18653/v1/2021.acl-long.477 fatcat:hez24b6rz5cqjczwzmzvdlob6m
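
A minimal sketch of the general idea in the snippet above, assuming a MoCo-style setup: two fast encoders (question and passage), two momentum ("slow") encoders, and two negative queues, with question-to-passage and passage-to-question InfoNCE losses optimized jointly. The toy encoder, queue size, momentum, temperature, and the plain fast-to-slow momentum pairing are illustrative assumptions, not the paper's exact xMoCo update scheme.

import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    # Toy stand-in for a BERT-style text encoder (a small MLP for illustration).
    def __init__(self, dim_in=128, dim_out=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim_in, 256), nn.ReLU(), nn.Linear(256, dim_out))

    def forward(self, x):
        return F.normalize(self.net(x), dim=-1)


class XMoCoSketch(nn.Module):
    def __init__(self, dim=64, queue_size=1024, momentum=0.999, temperature=0.05):
        super().__init__()
        self.m, self.t = momentum, temperature
        self.fast_q, self.fast_p = Encoder(dim_out=dim), Encoder(dim_out=dim)
        self.slow_q, self.slow_p = Encoder(dim_out=dim), Encoder(dim_out=dim)
        for slow, fast in ((self.slow_q, self.fast_q), (self.slow_p, self.fast_p)):
            slow.load_state_dict(fast.state_dict())
            for p in slow.parameters():
                p.requires_grad = False
        # FIFO queues of momentum-encoded negatives: passages for the q->p loss,
        # questions for the p->q loss.
        self.register_buffer("queue_p", F.normalize(torch.randn(queue_size, dim), dim=-1))
        self.register_buffer("queue_q", F.normalize(torch.randn(queue_size, dim), dim=-1))

    @torch.no_grad()
    def _momentum_update(self):
        # Exponential-moving-average update of the slow encoders.
        for slow, fast in ((self.slow_q, self.fast_q), (self.slow_p, self.fast_p)):
            for ps, pf in zip(slow.parameters(), fast.parameters()):
                ps.mul_(self.m).add_(pf, alpha=1.0 - self.m)

    def _info_nce(self, anchors, positives, queue):
        # Positive similarity against the paired example, negatives from the queue.
        l_pos = (anchors * positives).sum(dim=-1, keepdim=True)
        l_neg = anchors @ queue.t()
        logits = torch.cat([l_pos, l_neg], dim=1) / self.t
        labels = torch.zeros(anchors.size(0), dtype=torch.long, device=anchors.device)
        return F.cross_entropy(logits, labels)

    def forward(self, questions, passages):
        q_fast, p_fast = self.fast_q(questions), self.fast_p(passages)
        with torch.no_grad():
            self._momentum_update()
            q_slow, p_slow = self.slow_q(questions), self.slow_p(passages)
        # Jointly optimize question-to-passage and passage-to-question matching.
        loss = self._info_nce(q_fast, p_slow, self.queue_p) + \
               self._info_nce(p_fast, q_slow, self.queue_q)
        # Enqueue the momentum embeddings as negatives for later batches (ring buffer).
        self.queue_p = torch.cat([p_slow, self.queue_p])[: self.queue_p.size(0)]
        self.queue_q = torch.cat([q_slow, self.queue_q])[: self.queue_q.size(0)]
        return loss

With random feature vectors as stand-ins for tokenized text, loss = XMoCoSketch()(torch.randn(8, 128), torch.randn(8, 128)) runs end to end, and backpropagation only updates the two fast encoders.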

Aligning Cross-lingual Sentence Representations with Dual Momentum Contrast [article]

Liang Wang, Wei Zhao, Jingming Liu
2021 arXiv   pre-print
In this paper, we propose to align sentence representations from different languages into a unified embedding space, where semantic similarities (both cross-lingual and monolingual) can be computed with ... Acknowledgements We would like to thank three anonymous reviewers for their valuable comments, and EMNLP 2021 organizers for their efforts. ... We start with the default hyperparameters from MoCo and use grid search to find the optimal values for several hyperparameters. The specific search ranges are {10⁻⁵, 2 × 10⁻⁵, ...
arXiv:2109.00253v1 fatcat:f3vubvdt7nfq5kxb2buvn73qhq
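
As a small illustration of what the unified embedding space in the snippet above buys you, the sketch below scores both cross-lingual and monolingual sentence pairs with one pairwise similarity function. The cosine scoring and the 768-dimensional random stand-ins for encoder outputs are assumptions for illustration, not the paper's exact similarity measure or encoder.

import torch
import torch.nn.functional as F

def similarity_matrix(embeddings_a, embeddings_b):
    # Pairwise cosine similarities between two batches of sentence embeddings
    # that are assumed to live in the same (aligned) space.
    a = F.normalize(embeddings_a, dim=-1)
    b = F.normalize(embeddings_b, dim=-1)
    return a @ b.t()

# Random stand-ins for outputs of one aligned encoder: rows for 4 English
# sentences, columns for 5 Chinese sentences (hypothetical batch sizes).
en = torch.randn(4, 768)
zh = torch.randn(5, 768)
print(similarity_matrix(en, zh).shape)  # cross-lingual case: torch.Size([4, 5])
print(similarity_matrix(en, en).shape)  # monolingual case: torch.Size([4, 4])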

Aligning Cross-lingual Sentence Representations with Dual Momentum Contrast

Liang Wang, Wei Zhao, Jingming Liu
2021 Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing   unpublished
In this paper, we propose to align sentence representations from different languages into a unified embedding space, where semantic similarities (both cross-lingual and monolingual) can be computed with ... Acknowledgements We would like to thank three anonymous reviewers for their valuable comments, and EMNLP 2021 organizers for their efforts. ... We start with the default hyperparameters from MoCo and use grid search to find the optimal values for several hyperparameters. The specific search ranges are {10⁻⁵, 2 × 10⁻⁵, ...
doi:10.18653/v1/2021.emnlp-main.309 fatcat:t43xg3cacvazvompbnwms4nikq
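
The snippet also mentions starting from MoCo's default hyperparameters and grid-searching a few of them, with a learning-rate range whose visible part is 10⁻⁵ and 2 × 10⁻⁵ (the rest is truncated). The sketch below shows that kind of sweep in generic form; the defaults, the extra temperature values, and the evaluate function are hypothetical placeholders, not the paper's actual ranges or training code.

import itertools

# MoCo-style starting defaults (illustrative values, not the paper's).
defaults = {"momentum": 0.999, "temperature": 0.07, "queue_size": 65536, "learning_rate": 1e-5}

# Only the two visible learning rates echo the snippet; the temperature values
# are placeholders, and the full ranges are truncated in the source.
search_space = {
    "learning_rate": [1e-5, 2e-5],
    "temperature": [0.05, 0.07],
}

def evaluate(config):
    # Hypothetical stand-in: train a model with `config` and return a dev-set score.
    return -abs(config["learning_rate"] - 2e-5) - abs(config["temperature"] - 0.05)

best_score, best_config = float("-inf"), None
keys = list(search_space)
for values in itertools.product(*(search_space[k] for k in keys)):
    config = {**defaults, **dict(zip(keys, values))}
    score = evaluate(config)
    if score > best_score:
        best_score, best_config = score, config

print(best_config)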