Ranking function adaptation with boosting trees
2011
ACM Transactions on Information Systems
Tree adaptation assumes that ranking functions are trained with the Stochastic Gradient Boosting Trees method, a gradient boosting method on regression trees. ...
In this paper, we propose a new approach called tree based ranking function adaptation ("Trada") to effectively utilize these data sources for training cross-domain ranking functions. ...
Although it can be applied to any regression-tree based ranking model, we will use ranking functions trained with the gradient boosting trees (GBT) method [Friedman 2001] in this paper. ...
doi:10.1145/2037661.2037663
fatcat:lbhweoqkzve7phaxrpfse3utbe
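The snippets above describe Trada's base model: ranking functions trained with gradient boosting on regression trees (Friedman 2001). As a minimal illustration of that base learner only (not the Trada adaptation algorithm itself), here is a squared-loss gradient boosting loop over one-feature regression stumps; the stump learner and toy data are our own simplifications:

```python
# Minimal sketch of the gradient boosting trees (GBT) base model the
# entry above assumes: squared loss, regression stumps, shrinkage.
# Illustrative only; this is not the Trada adaptation algorithm.

def fit_stump(x, residuals):
    """Fit a one-split regression stump to residuals by squared error."""
    best = None
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda xi, t=t, lm=lm, rm=rm: lm if xi <= t else rm

def predict(trees, xi, base, lr=0.1):
    return base + lr * sum(tree(xi) for tree in trees)

def fit_gbt(x, y, n_trees=100, lr=0.1):
    """Stagewise fitting: each stump approximates the current residuals,
    i.e. the negative gradient of the squared loss."""
    base = sum(y) / len(y)
    trees = []
    for _ in range(n_trees):
        resid = [yi - predict(trees, xi, base, lr) for xi, yi in zip(x, y)]
        trees.append(fit_stump(x, resid))
    return base, trees

# Toy step function: predictions approach 0 on the left, 1 on the right.
base, trees = fit_gbt([0, 1, 2, 3, 4, 5, 6, 7], [0, 0, 0, 0, 1, 1, 1, 1])
```

On this toy data the residual shrinks geometrically (factor 0.9 per round at shrinkage 0.1), so 100 rounds suffice for near-exact recovery of the step.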
Model Adaptation via Model Interpolation and Boosting for Web Search Ranking
[article]
2019
arXiv
pre-print
This paper explores two classes of model adaptation methods for Web search ranking: Model Interpolation and error-driven learning approaches based on a boosting algorithm. ...
The tree-based boosting algorithm achieves the best performance on most of the closed test sets where the test data and the training data are similar, but its performance drops significantly on the open ...
We thank Steven Yao's group at Microsoft Bing Search for their help with the experiments. ...
arXiv:1907.09471v1
fatcat:t3yo5aomxnhh3nooyjijtblvgq
Adapting boosting for information retrieval measures
2009
Information retrieval (Boston)
In addition, we show that starting with a previously trained model, and boosting using its residuals, furnishes an effective technique for model adaptation, and we give significantly improved results for ...
We present a new ranking algorithm that combines the strengths of two previous methods: boosted tree classification, and LambdaRank, which has been shown to be empirically optimal for a widely used information ...
MART is a boosted tree algorithm that performs gradient descent in function space [18] . ...
doi:10.1007/s10791-009-9112-1
fatcat:5i6mlb2gqveqljo55y3axblq2e
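The first snippet of this entry describes model adaptation by boosting on the residuals of a previously trained model. A hedged sketch of that idea with a stand-in source model and a single-threshold stump as weak learner (the paper applies this inside a LambdaRank/boosted-tree ranker; everything below is our simplification):

```python
# Sketch of residual-driven model adaptation: keep a source-domain
# model fixed and run extra boosting rounds on its residuals over a
# (small) target-domain sample. The source model and weak learner are
# stand-ins, not the paper's LambdaRank-based setup.

def source_model(xi):
    # Pretend this was trained on plentiful source-domain data.
    return 0.5 * xi

def adapt(model, x_tgt, y_tgt, rounds=20, lr=0.5):
    corrections = []

    def current(xi):
        return model(xi) + lr * sum(c(xi) for c in corrections)

    for _ in range(rounds):
        resid = [yi - current(xi) for xi, yi in zip(x_tgt, y_tgt)]
        best = None  # best single-threshold stump on the residuals
        for t in sorted(set(x_tgt)):
            left = [r for xi, r in zip(x_tgt, resid) if xi <= t]
            right = [r for xi, r in zip(x_tgt, resid) if xi > t]
            if not left or not right:
                continue
            lm, rm = sum(left) / len(left), sum(right) / len(right)
            err = (sum((r - lm) ** 2 for r in left)
                   + sum((r - rm) ** 2 for r in right))
            if best is None or err < best[0]:
                best = (err, t, lm, rm)
        if best is None:
            break
        _, t, lm, rm = best
        corrections.append(lambda xi, t=t, lm=lm, rm=rm: lm if xi <= t else rm)
    return current

# Target domain shifts every label up by 1; residual boosting recovers it.
adapted = adapt(source_model, [0, 1, 2, 3], [1.0, 1.5, 2.0, 2.5])
```

The source model is never retrained; all target-domain information enters through the additive corrections, which is what makes the scheme usable when target labels are scarce.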
Multi-task learning for boosting with application to web search ranking
2010
Proceedings of the 16th ACM SIGKDD international conference on Knowledge discovery and data mining - KDD '10
In this paper we propose a novel algorithm for multi-task learning with boosted decision trees. ...
We evaluate our learning method on web-search ranking data sets from several countries. ...
As already noted in the introduction, boosted decision trees are very well suited for our web search ranking problem and we now present our algorithm, multi-boost for multi-task learning with boosting. ...
doi:10.1145/1835804.1835953
dblp:conf/kdd/ChapelleSVWZT10
fatcat:36ewsgphr5dohmy7k4xhk4mvuu
Boosted multi-task learning
2010
Machine Learning
In this paper we propose a novel algorithm for multi-task learning with boosted decision trees. ...
Our algorithm is derived using the relationship between ℓ1-regularization and boosting. We evaluate our learning method on web-search ranking data sets from several countries. ...
As already noted in the introduction, boosted decision trees are very well suited for our web search ranking problem and we now present our algorithm, multi-boost for multi-task learning with boosting. ...
doi:10.1007/s10994-010-5231-6
fatcat:7r6t3zwlk5d3hirxfjkuur4qq4
InfiniteBoost: building infinite ensembles with gradient descent
[article]
2018
arXiv
pre-print
Two notable ensemble methods widely used in practice are gradient boosting and random forests. ...
... of trees without the over-fitting effect. ...
For the ranking task, a fixed capacity value is used because the loss function is not convex in this case, and adaptation on the holdout significantly underestimates capacity. ...
arXiv:1706.01109v2
fatcat:z3kxlhjqcnhixayj52h7lm4gya
Tree adaptation assumes that ranking functions are trained with regression-tree based modeling methods, such as Gradient Boosting Trees. ...
In this paper, we propose a new approach called tree based ranking function adaptation ("tree adaptation") to address this problem. ...
Although it can be applied to any regression-tree based ranking model, we will use ranking functions trained with the gradient boosting trees (GBT) method [10] in this paper. ...
doi:10.1145/1458082.1458233
dblp:conf/cikm/ChenLWSHT08
fatcat:5o3gpy6ambc2vot2jfubpnvoua
Active learning of tree tensor networks using optimal least-squares
[article]
2021
arXiv
pre-print
Practical strategies are proposed for adapting the feature spaces and ranks to achieve a prescribed error. ...
In this paper, we propose new learning algorithms for approximating high-dimensional functions using tree tensor networks in a least-squares setting. ...
D. Estimation of the α-ranks of a function u to perform tree adaptation. We present here the algorithm that estimates α-ranks for tree adaptation. The strategy is described in Section 4.1. ...
arXiv:2104.13436v1
fatcat:3wzbu57hujcbbelf2ipdhyiini
Review of statistical methods for survival analysis using genomic data
2019
Genomics & Informatics
We review traditional survival methods and regularization methods, with various penalty functions, for the analysis of high-dimensional genomics, and describe machine learning techniques that have been adapted to survival analysis. ...
Both mboost and Cox-Boost are based on gradient boosting, but differ in the sense that mboost is an adaptation of model-based boosting, whereas Cox-Boost adapts likelihood-based boosting. ...
doi:10.5808/gi.2019.17.4.e41
pmid:31896241
pmcid:PMC6944043
fatcat:dw7rubh7v5a3hcgptsyqnydk6a
McRank: Learning to Rank Using Multiple Classification and Gradient Boosting
2007
Neural Information Processing Systems
We propose using the Expected Relevance to convert class probabilities into ranking scores. The class probabilities are learned using a gradient boosting tree algorithm. ...
... function, although the reported results in [5] are for pairwise. ...
Regression-based Ranking Using Boosting Tree Algorithm With slight modifications, the boosting tree algorithm can be used for regressions. ...
dblp:conf/nips/LiBW07
fatcat:hj7zoqzcofcidjiiamtfpvxojq
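The McRank snippet states the key conversion: per-document class probabilities over relevance grades become a single ranking score via the Expected Relevance. A minimal sketch of that conversion step (the probability vectors below are invented; learning them with a gradient boosting tree classifier is the part the paper covers):

```python
# Expected Relevance: turn a probability distribution over relevance
# grades 0..K-1 into one scalar ranking score, score = sum_k k * P(k).
# The probabilities here are made up for illustration; in McRank they
# come from a gradient boosting tree classifier.

def expected_relevance(probs):
    return sum(k * p for k, p in enumerate(probs))

def rank_by_expected_relevance(doc_probs):
    """Sort document ids by descending expected relevance."""
    return sorted(doc_probs,
                  key=lambda d: expected_relevance(doc_probs[d]),
                  reverse=True)

doc_probs = {
    "d1": [0.8, 0.1, 0.1],  # probably grade 0 -> score 0.3
    "d2": [0.1, 0.1, 0.8],  # probably grade 2 -> score 1.7
    "d3": [0.2, 0.6, 0.2],  # probably grade 1 -> score 1.0
}
ranking = rank_by_expected_relevance(doc_probs)  # ["d2", "d3", "d1"]
```

The conversion is what makes a multi-class classifier usable for ranking: ties between grade distributions are broken smoothly rather than by the argmax grade alone.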
Gradient Boosting Machine: A Survey
[article]
2019
arXiv
pre-print
In this survey, we discuss several different types of gradient boosting algorithms and illustrate their mathematical frameworks in detail: 1. introduction of gradient boosting leads to 2. objective function optimization, 3. loss function estimations, and 4. model constructions. 5. application of boosting in ranking. ...
Meanwhile, boosting performance depends on bias reduction when the weighted sampling is replaced with weighted tree fitting (Friedman et al., 2000) . ...
arXiv:1908.06951v1
fatcat:fgofwpdrn5hptfdqv2bzgbwcou
Gradient Boosting Neural Networks: GrowNet
[article]
2020
arXiv
pre-print
General loss functions are considered under this unified framework with specific examples presented for classification, regression, and learning to rank. ...
A fully corrective step is incorporated to remedy the pitfall of greedy function approximation of classic gradient boosting decision tree. ...
The adaptive boosting can be seen as a specific version of the gradient boosting algorithm where a simple exponential loss function is used [10] . ...
arXiv:2002.07971v2
fatcat:ck4smv5vrne7bf2f4crl7lxgei
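The last snippet repeats a standard identity: adaptive boosting (AdaBoost) can be seen as gradient boosting under the exponential loss. One way to see the connection: for labels y in {-1, +1}, the negative functional gradient of L = exp(-yF) at an example is y * exp(-yF), i.e. the label reweighted by the familiar AdaBoost weight exp(-yF). A quick finite-difference check of that gradient (our own illustration of a textbook identity, not GrowNet code):

```python
import math

# Exponential loss L(y, F) = exp(-y * F) for y in {-1, +1}.
# Its negative gradient in F is y * exp(-y * F): the label times the
# AdaBoost-style example weight exp(-y * F). Verified numerically by
# central finite differences.

def exp_loss(y, F):
    return math.exp(-y * F)

def neg_gradient(y, F):
    return y * math.exp(-y * F)

y, F, eps = 1.0, 0.3, 1e-6
finite_diff = -(exp_loss(y, F + eps) - exp_loss(y, F - eps)) / (2 * eps)
assert abs(finite_diff - neg_gradient(y, F)) < 1e-8
```

So fitting the next weak learner to the negative gradient of the exponential loss amounts to fitting it to the labels under AdaBoost's example weights, which is the sense in which AdaBoost is a special case.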
On domain similarity and effectiveness of adapting-to-rank
2009
Proceeding of the 18th ACM conference on Information and knowledge management - CIKM '09
Adapting-to-rank addresses the problem of insufficient domain-specific labeled training data in learning to rank. However, the initial study shows that adaptation is not always effective. ...
In this paper, we investigate the relationship between domain similarity and the effectiveness of domain adaptation with the help of two domain similarity measures: relevance correlation and sample ...
... the source domain function to the target domain with the Trada tree adaptation algorithm [1] that adjusts the gradient boosting tree structure with the target domain data. ...
doi:10.1145/1645953.1646182
dblp:conf/cikm/ChenBRT09
fatcat:kbcvqwag2ffkjhx6q46bmyshge
Plackett-Luce model for learning-to-rank task
[article]
2019
arXiv
pre-print
List-wise based learning to rank methods are generally supposed to have better performance than point-wise and pair-wise based methods. ...
Gradient Boosting and Regression Tree We review gradient boosting [14] as a general framework for function approximation using regression trees as the weak learners, which has been the most successful ...
A ranking function f scores each query-document pair, and returns sorted documents associated with the same query. ...
arXiv:1909.06722v1
fatcat:5rrqzb5vxfbzplvy2jrwgu44wm
Real-Time Face Identification via CNN and Boosted Hashing Forest
2016
2016 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
This BHF generalizes the Boosted SSC approach for hashing learning with joint optimization of face verification and identification. ...
The family of real-time face representations is obtained via Convolutional Network with Hashing Forest (CNHF). ...
Our CNHF with 2000 output 7-bit coding trees (CNHF-2000×7) achieves 98.59% verification accuracy and 93% rank-1 on LFW (add 3% to rank-1 of basic CNN). ...
doi:10.1109/cvprw.2016.25
dblp:conf/cvpr/VizilterGVK16
fatcat:5wjpjs4rpndn7c3misu5eils7u