177,121 Hits in 5.0 sec

Boosting Density Function Estimators [chapter]

Franck Thollard, Marc Sebban, Philippe Ezequel
2002 Lecture Notes in Computer Science  
In this paper, we focus on the adaptation of boosting to density function estimation, useful in a number of fields including Natural Language Processing and Computational Biology.  ...  In this paper, we do not take into account classification errors to optimize a classifier, but rather density estimation errors to optimize an estimator (here a probabilistic automaton) of a given target  ...  moment, i.e. the classic boosting, the boosting applied to prototype selection, and the boosting of density function estimation.  ... 
doi:10.1007/3-540-36755-1_36 fatcat:wnfgb3nbb5ffrjgu74wvvlt3mi

On Asymptotic Mean Integrated Squared Error's Reduction Techniques in Kernel Density Estimation

I. U. Siloko, E. A. Siloko, O. Ikpotokin, C. C. Ishiekwene, B.A. Afere
2019 International Journal of Computational and Theoretical Statistics  
The asymptotic mean integrated squared error (AMISE) is an optimality criterion function that measures the performance of a kernel density estimator.  ...  Techniques for reducing the asymptotic mean integrated squared error in kernel density estimation are the focus of this paper.  ...  Table: bandwidths and AMISE of kernel boosting density estimates (m = 1, m = 2); figure panels show density estimates against eruption length in minutes.  ... 
doi:10.12785/ijcts/060110 fatcat:cimuot76cfdc5gldvr3d3niz2y
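Under a Gaussian reference distribution, the AMISE criterion mentioned in the snippet above has a closed-form minimizer, Silverman's rule-of-thumb bandwidth h = (4/(3n))^(1/5) · σ. A minimal sketch (assuming a Gaussian kernel; the `silverman_bandwidth` and `gaussian_kde` helpers are illustrative names, not code from the paper):

```python
import numpy as np

def silverman_bandwidth(x):
    """Gaussian-reference AMISE-optimal bandwidth: h = (4/(3n))^(1/5) * sigma."""
    n = len(x)
    sigma = np.std(x, ddof=1)
    return (4.0 / (3.0 * n)) ** 0.2 * sigma

def gaussian_kde(x, data, h):
    """Kernel density estimate f_hat(x) = (1/(n h)) * sum_i K((x - x_i) / h)."""
    u = (np.asarray(x)[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=500)
h = silverman_bandwidth(data)
grid = np.linspace(-4.0, 4.0, 81)
f_hat = gaussian_kde(grid, data, h)
```

The rule is only AMISE-optimal when the underlying density is close to normal; for multimodal data it tends to oversmooth, which is one motivation for the AMISE-reduction techniques the paper surveys.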

A Meshsize Boosting Algorithm In Kernel Density Estimation

CC Ishiekwene, SM Ogbonmwan, JE Osemwenkhae
2008 Journal of Science and Technology (Ghana)  
This paper proposes a new algorithm for boosting in kernel density estimation (KDE).  ...  Kernel density estimation (KDE) is the process of constructing a density function from a given set of data.  ... 
doi:10.4314/just.v28i2.33120 fatcat:ov2guojanrfxfaq7hrvkj5bc3i

Looking for lumps: boosting and bagging for density estimation

Greg Ridgeway
2002 Computational Statistics & Data Analysis  
Analogous to the boosting framework, the algorithms iteratively mix the current density estimator with an additional density chosen in a greedy fashion to optimize a fit criterion.  ...  In this paper I extend these methods to the design of algorithms for density estimation for large, noisy, high dimensional datasets.  ...  The author is grateful to Werner Stuetzle for discussions on boosting with respect to density estimation, Larry Wasserman for his suggestion to use EM to make the modification proposals, and Dan McCaffrey  ... 
doi:10.1016/s0167-9473(01)00066-4 fatcat:kaue3ze2ujdf7gxlymkbzk2ure
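Ridgeway's abstract describes iteratively mixing the current density estimate with an additional density chosen greedily to improve a fit criterion. A minimal illustration of that stagewise-mixture idea (the quantile-based candidate set, the fixed mixing weight `alpha`, and the function names are assumptions for the sketch, not the paper's algorithm):

```python
import numpy as np

def gauss_pdf(x, mu, s):
    """Univariate Gaussian density."""
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

def stagewise_mixture(data, n_steps=10, alpha=0.3):
    """Greedy stagewise density estimation in the spirit of boosting:
    at each step, mix the current estimate with the candidate Gaussian
    that most improves the average log-likelihood of the data."""
    # Candidate components: Gaussians centred on data quantiles, two scales.
    mus = np.quantile(data, np.linspace(0.05, 0.95, 10))
    sigmas = [0.5 * data.std(), data.std()]
    candidates = [(m, s) for m in mus for s in sigmas]

    f = np.full_like(data, 1.0 / (data.max() - data.min()))  # flat start
    components = []
    for _ in range(n_steps):
        # Greedy step: pick the component whose mixture scores best.
        best = max(candidates,
                   key=lambda c: np.log((1 - alpha) * f
                                        + alpha * gauss_pdf(data, *c)).mean())
        f = (1 - alpha) * f + alpha * gauss_pdf(data, *best)
        components.append(best)
    return components, f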

Learned-loss boosting

Giles Hooker, James O. Ramsay
2012 Computational Statistics & Data Analysis  
This is done by jointly estimating a residual density for a prediction function at the same time as the prediction function is chosen to maximize the log likelihood of the estimated density.  ...  as boosting continues.  ...  Penalized Density Estimation The central technique in this paper is the estimation of a log density for the residuals of a prediction function.  ... 
doi:10.1016/j.csda.2012.05.019 fatcat:77hgjpjpynfd5gk7q6nfe4t7pi

Adaptive Kernel in Meshsize Boosting Algorithm in KDE

CC Ishiekwene, E Nwelih
2011 African Research Review  
This paper proposes the use of an adaptive kernel in a meshsize boosting algorithm in kernel density estimation.  ...  This weight, placed on the kernel estimator, is a ratio of a log function in which the denominator is a leave-one-out estimate of the density function.  ...  Boosting was first proposed by Schapire (1990).  ... 
doi:10.4314/afrrev.v5i1.64541 fatcat:nyerkrhv3vfqpdqv6fgynn7i4u

Boosting kernel density estimates: A bias reduction technique?

M. Di Marzio
2004 Biometrika  
This paper proposes an algorithm for boosting kernel density estimates.  ...  We show that boosting is closely linked to a previously proposed method of bias reduction and indicate how it should enjoy similar properties.  ...  Rather than the traditional goal of boosting classifiers, in this paper we consider the goal of density estimation, and investigate how kernel density estimates can be boosted  ... 
doi:10.1093/biomet/91.1.226 fatcat:xazjanyfsffcrfqootwwdkmkxe
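The bias-reduction connection above is usually illustrated with a multiplicative boosting scheme: observation weights are increased by the log ratio of the full kernel estimate to a leave-one-out estimate, and the stage estimates are multiplied together and renormalised. A hedged sketch of that recipe (the weight renormalisation and the evaluation grid are implementation choices for this sketch, not necessarily the paper's exact algorithm):

```python
import numpy as np

SQRT2PI = np.sqrt(2.0 * np.pi)

def kde(x, data, w, h):
    """Weighted Gaussian KDE: sum_i w_i * K_h(x - x_i), weights summing to 1."""
    u = (np.asarray(x)[..., None] - data) / h
    return (w * np.exp(-0.5 * u**2)).sum(axis=-1) / (h * SQRT2PI)

def boosted_kde(data, h, n_boost=2):
    """Multiplicative boosting of a KDE: reweight each point by how much the
    full estimate exceeds its leave-one-out estimate, then multiply the stage
    estimates together and renormalise on a grid."""
    n = len(data)
    w = np.full(n, 1.0 / n)
    grid = np.linspace(data.min() - 3 * h, data.max() + 3 * h, 2000)
    product = np.ones_like(grid)
    for _ in range(n_boost):
        product *= kde(grid, data, w, h)
        full = kde(data, data, w, h)
        # Leave-one-out value at x_i: drop point i's own kernel, K_h(0) = 1/(h*sqrt(2*pi)).
        loo = (full - w / (h * SQRT2PI)) / (1.0 - w)
        w = w + np.log(full / loo)  # points the current fit explains poorly gain weight
        w = w / w.sum()             # keep the next stage a proper weighted KDE
    dx = grid[1] - grid[0]
    return grid, product / (product.sum() * dx)
```

Because `full` always exceeds `loo`, the log ratio is positive and grows where the estimate leans heavily on a point's own kernel, which is the mechanism linked to bias reduction.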

Boosted coefficient models

Joseph Sexton, Petter Laake
2011 Statistics and computing  
Examples are given by conditional density estimation, hazard regression and regression with a functional response.  ...  We propose to estimate these coefficient functions using boosted tree models. Algorithms are provided for the above three situations, and real data sets are used to investigate their performance.  ...  covariates into a set of (univariate) real valued functions. For instance, in conditional density estimation the object is to estimate how the density of the response depends on the covariates.  ... 
doi:10.1007/s11222-011-9253-0 fatcat:yxqt4meecvhp3gwy4wyuweslzy

Page 4028 of Mathematical Reviews Vol. , Issue 2004e [page]

2004 Mathematical Reviews  
This article provides nonparametric estimators that nearly equal the MLE estimates for the marginal densities while remaining close to the kernel nonparametric estimates of the joint density  ...  estimates that incorporate marginal parametric density information.  ... 

Continuous Optimization based on Boosting Gaussian Mixture Model

Bin Lin, Xian-ji Wang, Run-tian Zhong, Zhen-quan Zhuang
2006 18th International Conference on Pattern Recognition (ICPR'06)  
A new Estimation of Distribution Algorithm (EDA) based on the Gaussian Mixture Model (GMM) is proposed, in which boosting, an efficient ensemble learning method, is adopted to estimate the GMM.  ...  Moreover, since boosting can be viewed as a gradient search for a good fit of some objective in function space, the new EDA is time efficient.  ...  Rosset et al. successfully applied the boosting methodology to the unsupervised learning problem of density estimation and proposed a general boosting density estimation framework [10].  ... 
doi:10.1109/icpr.2006.412 dblp:conf/icpr/LinWZZ06 fatcat:2hyrvkloaffl5g3gc6nwh6w5rq

Prediction of Top Tourist Attraction Spots using Learning Algorithms

2019 International journal of recent technology and engineering  
For this purpose, four algorithms have been used: Kernel Density Estimation, K-Nearest Neighbor, Random Forest and XGBoost.  ...  The findings revealed that XGBoost yields better results in terms of accuracy than the other three algorithms.  ...  We used four machine learning algorithms, KNN, KDE, Random Forest, and XGBoost, to train and test the classifiers for tourism-related opinion mining.  ... 
doi:10.35940/ijrte.c4241.098319 fatcat:cicp6vm25zd3tnecdw7cjz7z4q

Learning to Count with CNN Boosting [chapter]

Elad Walach, Lior Wolf
2016 Lecture Notes in Computer Science  
We follow modern learning approaches in which a density map is estimated directly from the input image.  ...  We employ CNNs and incorporate two significant improvements to state-of-the-art methods: layered boosting and selective sampling.  ... 
doi:10.1007/978-3-319-46475-6_41 fatcat:d74atjc4yjgbphwtf42vxwj4j4

LinCDE: Conditional Density Estimation via Lindsey's Method [article]

Zijun Gao, Trevor Hastie
2021 arXiv   pre-print
In this paper, we propose a conditional density estimator based on gradient boosting and Lindsey's method (LinCDE).  ...  In particular, when suitably parametrized, LinCDE will produce smooth and non-negative density estimates. Furthermore, like boosted regression trees, LinCDE does automatic feature selection.  ...  LinCDE boosting can be used to estimate the density ratio provided with any nuisance baseline conditional density estimate.  ... 
arXiv:2107.12713v2 fatcat:wzacxdz6bjh4nmryd6ec6almze

Gradient Boosted Normalizing Flows [article]

Robert Giaquinto, Arindam Banerjee
2020 arXiv   pre-print
We demonstrate the effectiveness of this technique for density estimation and, by coupling GBNF with a variational autoencoder, generative modeling of images.  ...  We propose an alternative: Gradient Boosted Normalizing Flows (GBNF) model a density by successively adding new NF components with gradient boosting.  ...  Figure 4: Matching the energy functions. Figure 5: Density estimation for 2D toy data.  ... 
arXiv:2002.11896v4 fatcat:tdq244q3bbeqzh2eq62ep6ukna

Boosting Independent Component Analysis [article]

Yunpeng Li, ZhaoHui Ye
2021 arXiv   pre-print
Our algorithm fills the gap in nonparametric independent component analysis by introducing boosting to maximum likelihood estimation.  ...  In this paper, we present a novel boosting-based algorithm for independent component analysis.  ...  Estimating the source's density via boosting: Boosting [34], [35] is a technique of combining multiple weak learners to produce a powerful committee, whose performance is significantly better than any  ... 
arXiv:2112.06920v2 fatcat:tfqktff5hzfmbi75e7nmt6jvge
Showing results 1 — 15 out of 177,121 results