A copy of this work was available on the public web and has been preserved in the Wayback Machine; the capture dates from 2020, and you can also visit the original URL. The file type is application/pdf.
NPRF: A Neural Pseudo Relevance Feedback Framework for Ad-hoc Information Retrieval
2018
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Pseudo relevance feedback (PRF) is commonly used to boost the performance of traditional information retrieval (IR) models by using top-ranked documents to identify and weight new query terms, thereby ...
To bridge this gap, we propose an end-to-end neural PRF framework that can be used with existing neural IR models by embedding different neural models as building blocks. ...
This work proposes an approach for incorporating relevance feedback information by embedding neural IR models within a neural pseudo relevance feedback framework, where the models consume feedback information ...
doi:10.18653/v1/d18-1478
dblp:conf/emnlp/LiSHWHYSX18
fatcat:jxizmm2nc5glldd4e5afq2pzbm
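The snippet above describes embedding neural IR models as building blocks that consume feedback information. Roughly, NPRF scores a target document against each pseudo-relevant feedback document and aggregates those scores, weighted by each feedback document's own relevance to the query. A minimal numeric sketch of that aggregation (the paper uses learned neural components for each piece; the weighted sum and all numbers here are illustrative):

```python
import numpy as np

def nprf_score(q_doc_rel, docdoc_rel):
    """Score candidate documents by their similarity to the PRF documents,
    each weighted by that PRF document's relevance to the query.
    A simple weighted sum stands in for the paper's learned combination."""
    w = q_doc_rel / (q_doc_rel.sum() + 1e-9)   # normalize feedback weights
    return docdoc_rel @ w                      # one score per candidate

q_rel = np.array([0.9, 0.6, 0.3])        # rel(q, d_i) for 3 feedback docs
dd = np.array([[0.8, 0.7, 0.1],           # rel(cand, d_i) for 2 candidates
               [0.2, 0.1, 0.9]])
print(nprf_score(q_rel, dd))
```

A candidate that resembles the highly weighted feedback documents (the first row) ends up ranked above one that only resembles the weakest feedback document.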
NPRF: A Neural Pseudo Relevance Feedback Framework for Ad-hoc Information Retrieval
[article]
2018
arXiv
pre-print
Pseudo-relevance feedback (PRF) is commonly used to boost the performance of traditional information retrieval (IR) models by using top-ranked documents to identify and weight new query terms, thereby ...
To bridge this gap, we propose an end-to-end neural PRF framework that can be used with existing neural IR models by embedding different neural models as building blocks. ...
This work proposes an approach for incorporating relevance feedback information by embedding neural IR models within a neural pseudo relevance feedback framework, where the models consume feedback information ...
arXiv:1810.12936v1
fatcat:lkhtueooarbijbygstqmkzug7m
Sequence-to-Set Semantic Tagging: End-to-End Multi-label Prediction using Neural Attention for Complex Query Reformulation and Automated Text Categorization
[article]
2019
arXiv
pre-print
expansion task for the TREC CDS 2016 challenge dataset when evaluated on an Okapi BM25-based document retrieval system; and also over the MLTM baseline (Soleimani et al., 2016), for both supervised and ...
Therefore, inspired by the recent success of sequence-to-sequence neural models in delivering the state-of-the-art in a wide range of NLP tasks, we develop a novel sequence-to-set framework with neural ...
in an end-to-end neural framework. ...
arXiv:1911.04427v1
fatcat:gb7fiztkgveuhnsit2x6zh57hy
Implicit-Explicit Representations for Case-Based Retrieval
2018
Biennial Conference on Design of Experimental Search & Information Retrieval Systems
documents to improve medical case-based retrieval. ...
We propose an IR framework to combine the implicit representations (identified using distributional representation techniques) and the explicit representations (derived from external knowledge sources) of ...
We propose to combine implicit and explicit representations for case-based retrieval in two different ways: (i) considering document-level knowledge graphs as additional inputs for end-to-end neural scoring ...
dblp:conf/desires/Marchesin18
fatcat:67pchzf3a5b4la6mmgdzwelr2m
LoL: A Comparative Regularization Loss over Query Reformulation Losses for Pseudo-Relevance Feedback
[article]
2022
arXiv
pre-print
Pseudo-relevance feedback (PRF) has proven to be an effective query reformulation technique to improve retrieval accuracy. ...
It aims to alleviate the mismatch of linguistic expressions between a query and its potential relevant documents. ...
Finally, we present an end-to-end query reformulation method based on vector space as an instance of this framework and introduce its special handling for sparse and dense retrieval. ...
arXiv:2204.11545v1
fatcat:zdc4cezkanbjnad5i3yz3g43l4
Dimension Projection among Languages based on Pseudo-relevant Documents for Query Translation
[article]
2016
arXiv
pre-print
Using top-ranked documents in response to a query has been shown to be an effective approach to improve the quality of query translation in dictionary-based cross-language information retrieval. ...
In this paper, we propose a new method for dictionary-based query translation based on dimension projection of embedded vectors from the pseudo-relevant documents in the source language to their equivalents ...
Introduction: Pseudo-relevance feedback (PRF) has long been shown to be an effective approach for updating query language models in information retrieval (IR) [6, 12, 4, 3]. ...
arXiv:1605.07844v2
fatcat:vzrmsfkcgjbbxcmulz633pesoy
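The snippet above describes projecting embedded vectors from pseudo-relevant documents in the source language onto their target-language equivalents. One simple way to realize such a projection is a linear map fit by least squares over aligned term pairs; the sketch below illustrates that general idea with toy data (the dimensions, noise level, and fitting procedure are all assumptions for illustration, not the paper's method):

```python
import numpy as np

# Toy embeddings: rows are terms, columns are embedding dimensions.
rng = np.random.default_rng(0)
src = rng.normal(size=(20, 8))                       # source-language term vectors
true_map = rng.normal(size=(8, 8))                   # hidden cross-lingual map
tgt = src @ true_map + 0.01 * rng.normal(size=(20, 8))  # noisy target equivalents

# Learn a linear projection W from source dimensions onto target dimensions
# via least squares over the aligned pairs.
W, *_ = np.linalg.lstsq(src, tgt, rcond=None)

def project(query_vec):
    """Project a source-language query vector into the target space."""
    return query_vec @ W

# A projected source vector should land near its target-language equivalent.
print(np.allclose(project(src[0]), tgt[0], atol=0.5))
```

With the projection in hand, a dictionary-based translation of the query can be refined by preferring candidate translations whose target-space vectors lie close to the projected query.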
Co-BERT: A Context-Aware BERT Retrieval Model Incorporating Local and Query-specific Context
[article]
2021
arXiv
pre-print
using pseudo relevance feedback before modeling the relevance of a group of documents jointly. ...
To mitigate this gap, in this work, an end-to-end transformer-based ranking model, named Co-BERT, has been proposed to exploit several BERT architectures to calibrate the query-document representations ...
to encode long sequences, like Reformer [15], Conformer [24], and Longformer [5], we plan to use them to replace the BERT model in Co-BERT, hoping to better incorporate the cross-document interaction ...
arXiv:2104.08523v1
fatcat:w24yxin3vjfqdl4oi74s7lsuba
Literature Retrieval for Precision Medicine with Neural Matching and Faceted Summarization
2020
Findings of the Association for Computational Linguistics: EMNLP 2020
In this paper, we present a document reranking approach that combines neural query-document matching and text summarization toward such retrieval scenarios. ...
The outcomes of (b) and (c) are used to essentially transform a candidate document into a concise summary that can be compared with the query at hand to compute a relevance score. ...
Another straightforward idea is to reuse generated pseudo-query sentences in the eDisMax query by Solr, as a form of pseudo relevance feedback. ...
doi:10.18653/v1/2020.findings-emnlp.304
pmid:34541588
pmcid:PMC8444997
fatcat:elsv2diavfe7vawrnwoohyagqm
Improving Query Representations for Dense Retrieval with Pseudo Relevance Feedback
[article]
2021
arXiv
pre-print
This paper proposes ANCE-PRF, a new query encoder that uses pseudo relevance feedback (PRF) to improve query representations for dense retrieval. ...
Dense retrieval systems conduct first-stage retrieval using embedded representations and simple similarity metrics to match a query to documents. ...
A common technique to improve query understanding in sparse retrieval systems is pseudo relevance feedback (PRF) [7, 19, 34], which uses the top retrieved documents from an initial search as additional ...
arXiv:2108.13454v1
fatcat:4tt4lquosbce3iwzzkbk64jd3q
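ANCE-PRF trains a new query encoder over the query plus its feedback documents. A much simpler, training-free cousin of this idea (in the spirit of Rocchio applied in embedding space, not the paper's encoder) just mixes the query embedding with the centroid of the top-k retrieved document embeddings; the function name and mixing weights below are hypothetical:

```python
import numpy as np

def prf_query(query_emb, doc_embs, k=3, alpha=0.7, beta=0.3):
    """Rocchio-style dense PRF sketch: blend the query embedding with the
    centroid of the top-k feedback document embeddings (alpha/beta are
    illustrative mixing weights, not values from the paper)."""
    sims = doc_embs @ query_emb                # dot-product relevance scores
    topk = doc_embs[np.argsort(-sims)[:k]]     # top-k pseudo-relevant docs
    return alpha * query_emb + beta * topk.mean(axis=0)

rng = np.random.default_rng(1)
docs = rng.normal(size=(100, 16))   # toy corpus of dense document vectors
q = rng.normal(size=16)
q_new = prf_query(q, docs)
print(q_new.shape)
```

The refined vector `q_new` is then used for a second dense retrieval pass with the same similarity metric; setting `beta=0` recovers the original query unchanged.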
Task-Oriented Query Reformulation with Reinforcement Learning
[article]
2017
arXiv
pre-print
In this work, we introduce a query reformulation system based on a neural network that rewrites a query to maximize the number of relevant documents returned. ...
We train this neural network with reinforcement learning. The actions correspond to selecting terms to build a reformulated query, and the reward is the document recall. ...
Pseudo Relevance Feedback (PRF-TFIDF): A query is expanded with terms from the documents retrieved by a search engine using the original query. ...
arXiv:1704.04572v4
fatcat:gpt2ybtpyvf3rmarzaqa3scfji
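The PRF-TFIDF baseline described in the snippet above expands a query with terms taken from the documents retrieved for the original query. A minimal sketch of that idea, scoring candidate expansion terms by term frequency in the feedback documents times inverse document frequency over a toy collection (the scoring details and data are illustrative, not the paper's exact baseline):

```python
from collections import Counter
import math

def prf_expand(query, ranked_docs, collection, k=3, n_terms=5):
    """Expand a query with the top TF-IDF terms from the k highest-ranked
    documents retrieved for the original query (minimal PRF-TFIDF sketch)."""
    n = len(collection)
    df = Counter(t for doc in collection for t in set(doc.split()))
    tf = Counter(t for doc in ranked_docs[:k] for t in doc.split())
    scores = {t: c * math.log((n + 1) / (df[t] + 1))
              for t, c in tf.items() if t not in query.split()}
    expansion = sorted(scores, key=scores.get, reverse=True)[:n_terms]
    return query.split() + expansion

collection = [
    "neural ranking models for ad hoc retrieval",
    "pseudo relevance feedback boosts retrieval accuracy",
    "feedback documents supply useful expansion terms",
    "the weather is sunny today",
]
# Pretend the first three documents were the top-ranked results.
print(prf_expand("neural retrieval", collection[:3], collection, n_terms=2))
```

The term "feedback" wins here because it recurs across the feedback documents while remaining rare in the collection; the reformulated query is then re-submitted to the search engine.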
Task-Oriented Query Reformulation with Reinforcement Learning
2017
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
In this work, we introduce a query reformulation system based on a neural network that rewrites a query to maximize the number of relevant documents returned. ...
We train this neural network with reinforcement learning. The actions correspond to selecting terms to build a reformulated query, and the reward is the document recall. ...
Pseudo Relevance Feedback (PRF-TFIDF): A query is expanded with terms from the documents retrieved by a search engine using the original query. ...
doi:10.18653/v1/d17-1061
dblp:conf/emnlp/NogueiraC17
fatcat:vukpwjvbozclbiut7nmlaypqxu
Interactive Spoken Content Retrieval by Deep Reinforcement Learning
[article]
2016
arXiv
pre-print
This is impossible for spoken content retrieval, because the retrieved items are difficult to show on screen. ...
The suitable actions depend on the retrieval status, for example requesting for extra information from the user, returning a list of topics for user to select, etc. ...
We adopt the query-regularized mixture model [20, 21, 22] previously proposed for pseudo-relevance feedback to estimate the new query models θ_q. ...
arXiv:1609.05234v1
fatcat:67kww7wyvbffhkdwzlfnjrmvna
Integrating Lexical and Temporal Signals in Neural Ranking Models for Searching Social Media Streams
[article]
2017
arXiv
pre-print
To our knowledge, we are the first to integrate lexical and temporal signals in an end-to-end neural network architecture, in which existing neural ranking models are used to generate query-document similarity ...
Our intuition is that neural networks provide a more expressive framework to capture the temporal coherence of neighboring documents in time. ...
pseudo relevance feedback. ...
arXiv:1707.07792v1
fatcat:u4rvxt2m7bbk7h4qhypj4ba3ly
Learning to Expand: Reinforced Pseudo-relevance Feedback Selection for Information-seeking Conversations
[article]
2020
arXiv
pre-print
With the development of research in such conversation systems, the pseudo-relevance feedback (PRF) has demonstrated its effectiveness in incorporating relevance signals from external documents. ...
In this work, we treat PRF selection as a learning task and propose a reinforcement learning based method that can be trained in an end-to-end manner without any human annotations. ...
[13] proposed an end-to-end neural model to make the query directly interact with the retrieved documents. ...
arXiv:2011.12771v1
fatcat:frpxvpshr5bb3jksiqrkv7lzku
Relevance-based Word Embedding
2017
Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval - SIGIR '17
To train our models, we used over six million unique queries and the top ranked documents retrieved in response to each query, which are assumed to be relevant to the query. ...
to the relevant or non-relevant class for each query. ...
Acknowledgements. The authors thank Daniel Cohen, Mostafa Dehghani, and Qingyao Ai for their invaluable comments. This work was supported in part by the Center for Intelligent Information Retrieval. ...
doi:10.1145/3077136.3080831
dblp:conf/sigir/ZamaniC17
fatcat:dloqxjo34nbifevkjlzdah3lxm