853 Hits in 4.1 sec

Curriculum Learning for Dense Retrieval Distillation [article]

Hansi Zeng, Hamed Zamani, Vishwa Vinay
2022 arXiv   pre-print
Recent work has shown that more effective dense retrieval models can be obtained by distilling ranking knowledge from an existing base re-ranking model.  ...  CL-DRD iteratively optimizes the dense retrieval (student) model by increasing the difficulty of the knowledge distillation data made available to it.  ...  In this paper, we take advantage of this flexibility and introduce a generic curriculum learning framework for training dense retrieval models via knowledge distillation.  ... 
arXiv:2204.13679v1 fatcat:j2r4kcalpbalbh3xk4lp27wv4i
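
The snippet above describes CL-DRD's training loop only at a high level. The following minimal Python sketch illustrates the general idea of staging knowledge-distillation data from easy to hard for the student retriever; the function names and the difficulty proxy are hypothetical and are not taken from the paper.

def difficulty(example):
    # Hypothetical proxy: the teacher's score margin between the positive
    # passage and the hardest negative; a small margin means a hard example.
    return example["teacher_pos_score"] - max(example["teacher_neg_scores"])

def curriculum_stages(examples, num_stages=3):
    # Sort distillation examples from easy to hard and split them into
    # stages, so each training round exposes the student to harder data.
    ordered = sorted(examples, key=difficulty, reverse=True)  # easy first
    k = len(ordered) // num_stages
    return [ordered[i * k:(i + 1) * k if i < num_stages - 1 else None]
            for i in range(num_stages)]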

Curriculum Contrastive Context Denoising for Few-shot Conversational Dense Retrieval

Kelong Mao, Zhicheng Dou, Hongjin Qian
2022 Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval  
To address these issues, in this paper, we present a novel Curriculum cOntrastive conTExt Denoising framework, COTED, towards few-shot conversational dense retrieval.  ...  However, enhancing the context denoising ability for conversational search is quite challenging due to data scarcity and the steep difficulty of simultaneously learning conversational query encoding and  ...  The authors would like to thank the reviewers for their valuable comments and thank Liangcai Su and Dr. Jieming Zhu for their helpful discussion.  ... 
doi:10.1145/3477495.3531961 fatcat:lrvqeehqknf3lhc7r3f2ogulqy

Semantic Models for the First-stage Retrieval: A Comprehensive Review [article]

Yinqiong Cai, Yixing Fan, Jiafeng Guo, Fei Sun, Ruqing Zhang, Xueqi Cheng
2021 arXiv   pre-print
We believe it is the right time to survey the current status, learn from existing methods, and gain insights for future development.  ...  Therefore, it has been a long-term desire to build semantic models for first-stage retrieval that can achieve high recall efficiently.  ...  However, whether a curriculum learning mechanism would help model optimization for first-stage retrieval has not been studied.  ... 
arXiv:2103.04831v3 fatcat:6qa7hvc3jve3pcmo2mo4qsiefq

Sparse and Dense Approaches for the Full-rank Retrieval of Responses for Dialogues [article]

Gustavo Penha, Claudia Hauff
2022 arXiv   pre-print
We investigate both dialogue context and response expansion techniques for sparse retrieval, as well as zero-shot and fine-tuned dense retrieval approaches.  ...  Our findings based on three different information-seeking dialogue datasets reveal that a learned response expansion technique is a solid baseline for sparse retrieval.  ...  Penha and Hauff [44] showed a way of using a BERT-based re-ranking model for the dialogues domain, which is improved when notions of difficulty are taken into account in a curriculum learning training  ... 
arXiv:2204.10558v1 fatcat:jbab4qy6rrdetisxxa6cuapbi4

Low-Resource Dense Retrieval for Open-Domain Question Answering: A Comprehensive Survey [article]

Xiaoyu Shen, Svitlana Vakulenko, Marco del Tredici, Gianni Barlacchi, Bill Byrne, Adrià de Gispert
2022 arXiv   pre-print
Dense retrieval (DR) approaches based on powerful pre-trained language models (PLMs) have achieved significant advances and become a key component of modern open-domain question-answering systems.  ...  For every technique, we introduce its general-form algorithm, highlight the open issues, and discuss its pros and cons. Promising directions are outlined for future research.  ...  Zhou et al. (2022b) apply meta learning to reduce the teacher-student gap. Zeng et al. (2022a) show that curriculum learning can be used to speed up the learning of the student dense retrieval model.  ... 
arXiv:2208.03197v1 fatcat:rmp7mwbh2fhvxansfda5b62sma
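
For readers unfamiliar with the distillation setup this survey covers, a common formulation trains the dual-encoder student to match the soft score distribution of a cross-encoder teacher over a candidate list. The PyTorch sketch below shows this standard listwise KL-divergence objective; it is a generic illustration, not the survey's or any cited paper's exact loss.

import torch.nn.functional as F

def kd_loss(student_scores, teacher_scores, tau=1.0):
    # student_scores, teacher_scores: (batch, num_candidates) relevance
    # scores for the same candidate documents from student and teacher.
    # Soften both into distributions and minimize KL(teacher || student).
    s = F.log_softmax(student_scores / tau, dim=-1)
    t = F.softmax(teacher_scores / tau, dim=-1)
    return F.kl_div(s, t, reduction="batchmean") * tau ** 2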

Emulating Quantum Dynamics with Neural Networks via Knowledge Distillation [article]

Yu Yao, Chao Cao, Stephan Haas, Mahak Agarwal, Divyam Khanna, Marcin Abram
2022 arXiv   pre-print
Our approach is based on the idea of knowledge distillation and uses elements of curriculum learning. It works by constructing a set of simple but physics-rich training examples (a curriculum).  ...  Here, we introduce an efficient training framework for constructing machine learning-based emulators.  ...  Acknowledgments The authors acknowledge the Center for Advanced Research Computing (CARC) at the University of Southern California for providing computing resources that have contributed to the research  ... 
arXiv:2203.10200v1 fatcat:4v3tujayungjbasv52ioscqjtq

The Inseparable Bond Between Research and Medical Education

Mindy George-Weinstein
2019 The Journal of the American Osteopathic Association  
However, integrating instruction in the principles and practice of research into an already dense curriculum presents significant challenges.  ...  Immersion in team-based and active learning groups leads to the same end of gaining knowledge; however, students determine what information is relevant, retrieve it themselves, and teach their classmates  ... 
doi:10.7556/jaoa.2019.108 pmid:31449300 fatcat:dvhove5t35fflb36jonb55wjte

CARLS: Cross-platform Asynchronous Representation Learning System [article]

Chun-Ta Lu, Yun Zeng, Da-Cheng Juan, Yicheng Fan, Zhe Li, Jan Dlabal, Yi-Ting Chen, Arjun Gopalan, Allan Heydon, Chun-Sung Ferng, Reah Miyara, Ariel Fuxman (+4 others)
2021 arXiv   pre-print
We also describe three learning paradigms -- semi-supervised learning, curriculum learning and multimodal learning -- as examples that can be scaled up efficiently by CARLS.  ...  The proposed CARLS is particularly suitable for learning paradigms where model training benefits from additional knowledge inferred or discovered during training, such as node embeddings for graph neural  ...  Fig. 4 illustrates the workflow of using CARLS for curriculum learning.  ... 
arXiv:2105.12849v1 fatcat:elzzirsqkngxpemgfmw2o6lnzi
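
CARLS's key point, per the snippet, is that helper workers compute and refresh knowledge (such as embeddings) asynchronously while trainers consume it. The sketch below is a deliberately simplified, thread-safe stand-in for that pattern in plain Python; the class and its API are hypothetical and not part of CARLS.

import threading

class SharedEmbeddingStore:
    # Simplified stand-in for an asynchronously updated knowledge store:
    # "knowledge maker" workers write fresh embeddings while trainer
    # workers read them without blocking on their computation.
    def __init__(self):
        self._table = {}
        self._lock = threading.Lock()

    def update(self, key, vector):
        with self._lock:
            self._table[key] = vector

    def lookup(self, key, default=None):
        with self._lock:
            return self._table.get(key, default)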

12-in-1: Multi-Task Vision and Language Representation Learning

Jiasen Lu, Vedanuj Goswami, Marcus Rohrbach, Devi Parikh, Stefan Lee
2020 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)  
vision-and-language research focuses on a small but diverse set of independent tasks and supporting datasets often studied in isolation; however, the visually-grounded language understanding skills required for  ...  Our approach culminates in a single model on 12 datasets from four broad categories of tasks, including visual question answering, caption-based image retrieval, grounding referring expressions, and multi-modal  ...  Curriculum Learning. Inspired by prior multi-task literature [4, 31], we experimented with both curriculum and anti-curriculum strategies based on task difficulty.  ... 
doi:10.1109/cvpr42600.2020.01045 dblp:conf/cvpr/LuGRPL20 fatcat:kmcnv5rwdjcflfgjwqy3cugv7u
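
As a concrete reading of the curriculum vs. anti-curriculum experiment mentioned in the last snippet: both strategies order tasks by a difficulty estimate and differ only in direction. A minimal sketch, with a hypothetical difficulty proxy such as single-task validation error:

def task_schedule(tasks, difficulty, anti=False):
    # Curriculum: train on easy tasks first; anti-curriculum: hard first.
    # `difficulty` maps task name -> difficulty estimate (hypothetical
    # proxy, e.g. single-task validation error).
    return sorted(tasks, key=lambda t: difficulty[t], reverse=anti)

# e.g. task_schedule(["vqa", "retrieval", "grounding"], d)        # curriculum
#      task_schedule(["vqa", "retrieval", "grounding"], d, True)  # anti-curriculum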

Pretrained Transformers for Text Ranking: BERT and Beyond [article]

Jimmy Lin, Rodrigo Nogueira, Andrew Yates
2021 arXiv   pre-print
We cover a wide range of modern techniques, grouped into two high-level categories: transformer models that perform reranking in multi-stage architectures and dense retrieval techniques that perform ranking  ...  The combination of transformers and self-supervised pretraining has been responsible for a paradigm shift in natural language processing (NLP), information retrieval (IR), and beyond.  ...  Special thanks goes out to two anonymous reviewers for their insightful comments and helpful feedback.  ... 
arXiv:2010.06467v3 fatcat:obla6reejzemvlqhvgvj77fgoy
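
The two high-level categories in the abstract correspond to two different scoring interfaces: a cross-encoder rescores a small candidate set inside a multi-stage pipeline, while dense retrieval ranks the whole corpus by similarity against precomputed document vectors. The sketch below contrasts them in generic Python with NumPy; it is a schematic of the taxonomy, not code from the book.

import numpy as np

def rerank(query, candidates, cross_encoder_score):
    # Multi-stage reranking: an expensive model jointly reads each
    # (query, document) pair, but only over a small candidate set.
    return sorted(candidates, key=lambda d: cross_encoder_score(query, d),
                  reverse=True)

def dense_retrieve(query_vec, doc_matrix, k=10):
    # Dense retrieval: documents are encoded offline into doc_matrix
    # (num_docs, dim); ranking is a dot product against the query vector.
    scores = doc_matrix @ query_vec
    return np.argsort(-scores)[:k]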

Pretrained Transformers for Text Ranking: BERT and Beyond

Andrew Yates, Rodrigo Nogueira, Jimmy Lin
2021 Proceedings of the 14th ACM International Conference on Web Search and Data Mining  
The goal of text ranking is to generate an ordered list of texts retrieved from a corpus in response to a query for a particular task.  ...  The combination of transformers and self-supervised pretraining has, without exaggeration, revolutionized the fields of natural language processing (NLP), information retrieval (IR), and beyond.  ...  We'd like to thank the following people for comments on earlier drafts of this work: Maura Grossman, Sebastian Hofstätter, Xueguang Ma, and Bhaskar Mitra.  ... 
doi:10.1145/3437963.3441667 fatcat:6teqmlndtrgfvk5mneq5l7ecvq

Effective and practical neural ranking

Sean MacAvaney
2021 SIGIR Forum  
Supervised machine learning methods that use neural networks ("deep learning") have yielded substantial improvements to a multitude of Natural Language Processing (NLP) tasks in the past decade.  ...  Improvements to Information Retrieval (IR) tasks, such as ad-hoc search, lagged behind those in similar NLP tasks, despite considerable community efforts.  ...  Table 3.8: Table of symbols for curriculum learning.  ... 
doi:10.1145/3476415.3476432 fatcat:fdjy53sggvhgxo5fa5hzpede2i

HunYuan_tvr for Text-Video Retrieval [article]

Shaobo Min, Weijie Kong, Rong-Cheng Tu, Dihong Gong, Chengfei Cai, Wenzhe Zhao, Chenyang Liu, Sixiao Zheng, Hongfa Wang, Zhifeng Li, Wei Liu
2022 arXiv   pre-print
Text-Video Retrieval plays an important role in multi-modal understanding and has attracted increasing attention in recent years.  ...  In this way, we can construct hierarchical video representations for frame-clip-video granularities, and also explore word-wise correlations to form word-phrase-sentence embeddings for the text modality  ...  For example, FROZEN [2] treats an image as a single-frame video and designs a curriculum learning schedule to train the model on both image and video datasets.  ... 
arXiv:2204.03382v5 fatcat:uds4wvgtbbbbxi7hrwe5ioyq7u
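
To make the "frame-clip-video granularities" in the snippet concrete, here is a minimal PyTorch sketch of a hierarchy of that shape, built with simple mean pooling. HunYuan_tvr's actual aggregation is more sophisticated; this only illustrates the granularity structure.

import torch

def hierarchical_video_repr(frame_feats, clip_size=8):
    # frame_feats: (num_frames, dim) per-frame embeddings.
    # Pool frames into clip vectors, then clips into one video vector,
    # yielding representations at frame, clip, and video granularity.
    clips = torch.stack([c.mean(dim=0) for c in frame_feats.split(clip_size)])
    video = clips.mean(dim=0)
    return frame_feats, clips, video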

Mention Memory: incorporating textual knowledge into Transformers through entity mention attention [article]

Michiel de Jong, Yury Zemlyanskiy, Nicholas FitzGerald, Fei Sha, William Cohen
2022 arXiv   pre-print
We also show that the model learns to attend to informative mentions without any direct supervision.  ...  Specifically, our method represents knowledge with 'mention memory', a table of dense vector representations of every entity mention in a corpus.  ...  ACKNOWLEDGEMENTS We thank Livio Baldini Soares, Kenton Lee, Tom Kwiatkowski, Ilya Eckstein and others at Google Research for insightful discussions.  ... 
arXiv:2110.06176v2 fatcat:h5uqxyywbfb5xixs23e3y7lasy
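
The "mention memory" in the snippet is, at its core, attention over a precomputed table of dense mention vectors. The PyTorch sketch below shows that mechanism in its simplest dense form; the real system additionally relies on approximate top-k search to keep attention over a corpus-scale table tractable, and this function is illustrative rather than the paper's architecture.

import torch
import torch.nn.functional as F

def attend_to_mention_memory(query, memory_keys, memory_values):
    # query: (dim,) hidden state of an entity mention in the input text.
    # memory_keys / memory_values: (num_mentions, dim) precomputed dense
    # vectors for every entity mention in the corpus.
    scores = memory_keys @ query          # (num_mentions,) dot products
    weights = F.softmax(scores, dim=0)    # attention over all mentions
    return weights @ memory_values        # retrieved knowledge vector, (dim,)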

Rethinking Search: Making Experts out of Dilettantes [article]

Donald Metzler, Yi Tay, Dara Bahri, Marc Najork
2021 arXiv   pre-print
Classical information retrieval systems do not answer information needs directly, but instead provide references to (hopefully authoritative) answers.  ...  When experiencing an information need, users want to engage with an expert, but often turn to an information retrieval system, such as a search engine, instead.  ...  Distilling Dense (ENNLP ’16). 2383–2392.  ... 
arXiv:2105.02274v1 fatcat:qdghlnv2nnfhnoo6eafdaxqxzy
Showing results 1 — 15 out of 853 results