118,674 Hits in 6.1 sec

Boosting with early stopping: Convergence and consistency

Tong Zhang, Bin Yu
2005 Annals of Statistics  
Using the numerical convergence result, we find early-stopping strategies under which boosting is shown to be consistent based on i.i.d. samples, and we obtain bounds on the rates of convergence for boosting  ...  Boosting is one of the most significant advances in machine learning for classification and regression.  ...  Theorem 4.1 can then be applied to show the convergence of such boosting procedures.  ... 
doi:10.1214/009053605000000255 fatcat:l3eocf3it5bijmltpet6fo7p7q
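
As a concrete illustration of the early-stopping idea in the entry above, the following minimal Python sketch runs L2-boosting with regression stumps and stops once a held-out validation loss stops improving. The data, shrinkage, and patience values are illustrative and are not taken from Zhang and Yu's analysis.

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(-2, 2, size=(400, 1))
    y = np.sin(3 * X[:, 0]) + 0.3 * rng.standard_normal(400)
    X_tr, y_tr, X_va, y_va = X[:300], y[:300], X[300:], y[300:]

    nu = 0.1                          # shrinkage (learning rate)
    F_tr = np.zeros_like(y_tr)        # ensemble prediction on training data
    F_va = np.zeros_like(y_va)        # ensemble prediction on validation data
    best_loss, best_iter, patience = np.inf, 0, 20

    for t in range(1, 501):
        stump = DecisionTreeRegressor(max_depth=1)
        stump.fit(X_tr, y_tr - F_tr)              # fit the current residuals
        F_tr += nu * stump.predict(X_tr)
        F_va += nu * stump.predict(X_va)
        val_loss = np.mean((y_va - F_va) ** 2)
        if val_loss < best_loss:
            best_loss, best_iter = val_loss, t
        elif t - best_iter >= patience:           # early-stopping rule
            break

    print(f"stopped at iteration {t}, best validation MSE {best_loss:.4f}")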

Proximal boosting and variants [article]

Erwan Fouillen, Claire Boyer, Maxime Sangnier
2021 arXiv   pre-print
From an optimization point of view, the learning procedure of gradient boosting mimics a gradient descent on a functional variable.  ...  Theoretical convergence is proved for the first two procedures under different hypotheses on the empirical risk, and the advantages of leveraging proximal methods for boosting are illustrated by numerical experiments  ...  They are also indebted to the Associate Editor and the reviewers for suggesting efforts on the theoretical and numerical sides of the paper.  ... 
arXiv:1808.09670v3 fatcat:phpyn6xpfzgtphrkkwxz6levxy
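
The functional-gradient view mentioned in this entry can be written down in a few lines: at every round the base learner is fit to the negative gradient of the loss with respect to the current predictions. The sketch below is generic gradient boosting for an arbitrary differentiable loss, included only as background; it does not reproduce the proximal variants studied in the paper, and all names and parameters are illustrative.

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    def gradient_boost(X, y, loss_grad, n_rounds=100, nu=0.1, max_depth=2):
        """loss_grad(y, F) returns dL/dF at the current predictions F."""
        F = np.zeros(len(y))
        learners = []
        for _ in range(n_rounds):
            g = -loss_grad(y, F)                              # pseudo-residuals
            h = DecisionTreeRegressor(max_depth=max_depth).fit(X, g)
            F += nu * h.predict(X)                            # functional gradient step
            learners.append(h)
        return learners

    # Squared loss: dL/dF = F - y, so the negative gradient is the residual y - F.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 2))
    y = X[:, 0] ** 2 + X[:, 1] + 0.1 * rng.normal(size=300)
    ensemble = gradient_boost(X, y, loss_grad=lambda y, F: F - y)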

Towards Practical Lottery Ticket Hypothesis for Adversarial Training [article]

Bai Li, Shiqi Wang, Yunhan Jia, Yantao Lu, Zhenyu Zhong, Lawrence Carin, Suman Jana
2020 arXiv   pre-print
on CIFAR-10 to achieve the state-of-the-art robustness.  ...  We show there exists a subset of the aforementioned sub-networks that converge significantly faster during the training process and thus can mitigate the cost issue.  ...  As a result, the learning rate scheduling obscures the improvement on convergence rates of boosting tickets; (ii) Due to fast convergence, boosting tickets tend to overfit, as observed in ResNet-18 after  ... 
arXiv:2003.05733v1 fatcat:zrzuwlgvmzfwxlspws34y5zhqq
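
For context on the sub-networks discussed here, the sketch below shows the magnitude-pruning step typical of lottery-ticket experiments: keep the largest-magnitude trained weights, zero out the rest, and rewind the survivors to their initial values before retraining. The toy weight matrix and keep ratio are illustrative, and the adversarial-training loop studied in the paper is omitted.

    import numpy as np

    def prune_and_rewind(w_trained, w_init, keep_ratio=0.2):
        """Return a rewound 'ticket' and its binary mask."""
        k = max(1, int(keep_ratio * w_trained.size))
        threshold = np.sort(np.abs(w_trained), axis=None)[-k]       # k-th largest magnitude
        mask = (np.abs(w_trained) >= threshold).astype(w_trained.dtype)
        return mask * w_init, mask

    rng = np.random.default_rng(0)
    w_init = rng.normal(size=(64, 32))
    w_trained = w_init + rng.normal(scale=0.5, size=w_init.shape)   # stand-in for training
    w_ticket, mask = prune_and_rewind(w_trained, w_init)
    print("fraction of weights kept:", mask.mean())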

InfiniteBoost: building infinite ensembles with gradient descent [article]

Alex Rogozhnikov, Tatiana Likhomanenko
2018 arXiv   pre-print
The proposed algorithm is evaluated on regression, classification, and ranking tasks using large-scale, publicly available datasets.  ...  of trees without the over-fitting effect.  ...  the previous predictors in the ensemble (boosting procedure).  ... 
arXiv:1706.01109v2 fatcat:z3kxlhjqcnhixayj52h7lm4gya

A Boosting Procedure for Variational-Based Image Restoration

Samad Wali
2018 Numerical Mathematics: Theory, Methods and Applications  
The convergence analysis of the boosting process is shown in a special case of total variation image denoising with "disk" input data.  ...  In each iteration of the boosting scheme, the variational model is solved by the augmented Lagrangian method.  ...  Convergence analysis for noise-free data in a continuous setting: In this subsection we discuss the convergence of our boosting method.  ... 
doi:10.4208/nmtma.oa-2017-0046 fatcat:54jqqxjygvh4tn4htzqfmipvqy
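
The boosting scheme sketched in this entry repeatedly strengthens the restored image using its own residual. The toy loop below uses the classic "add back the denoised residual" (twicing) form, with a crude box filter standing in for the total-variation / augmented-Lagrangian solver used in the paper; the filter, step count, and test image are illustrative only and do not reproduce the paper's scheme.

    import numpy as np

    def smooth(u, k=5):
        """Row-wise box filter, a crude stand-in for a variational denoiser."""
        kernel = np.ones(k) / k
        return np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, u)

    def boosted_denoise(f, n_boost=5):
        u = smooth(f)
        for _ in range(n_boost):
            u = u + smooth(f - u)      # add the denoised residual back (twicing)
        return u

    rng = np.random.default_rng(0)
    clean = np.tile(np.sign(np.sin(np.linspace(0, 6 * np.pi, 128))), (128, 1))
    noisy = clean + 0.3 * rng.standard_normal(clean.shape)
    restored = boosted_denoise(noisy)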

Boosted Histogram Transform for Regression

Yuchao Cai, Hanyuan Hang, Hanfang Yang, Zhouchen Lin
2020 International Conference on Machine Learning  
Moreover, if the target function resides in the subspace C^{1,α}, for the first time we manage to explain the benefits of the boosting procedure, by establishing the upper bound of the convergence rate  ...  In this paper, we propose a boosting algorithm for regression problems called boosted histogram transform for regression (BHTR) based on histogram transforms composed of random rotations, stretchings,  ...  Yang is supported by National Key R&D Program of China (2018YFC0830300). Z. Lin is supported by NSF  ... 
dblp:conf/icml/CaiHYL20 fatcat:ekkrrrqz6zdw5c6jfwysg256iu
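
To make the ingredients of this entry concrete, the sketch below boosts over histogram-transform base learners: each learner rotates the features with a random orthogonal matrix, bins the rotated coordinates, and predicts the mean residual in each cell. The bin count, shrinkage, and toy data are illustrative and do not follow the paper's exact construction.

    import numpy as np

    rng = np.random.default_rng(0)

    def fit_histogram_learner(X, r, n_bins=8):
        """Fit one histogram-transform learner to the residuals r."""
        d = X.shape[1]
        R, _ = np.linalg.qr(rng.normal(size=(d, d)))          # random rotation
        Z = X @ R
        edges = [np.linspace(Z[:, j].min(), Z[:, j].max(), n_bins + 1) for j in range(d)]

        def cell_index(Zq):
            cols = [np.clip(np.digitize(Zq[:, j], edges[j]) - 1, 0, n_bins - 1)
                    for j in range(d)]
            return list(zip(*cols))

        table = {}
        for key, val in zip(cell_index(Z), r):
            table.setdefault(key, []).append(val)
        table = {k: float(np.mean(v)) for k, v in table.items()}
        return lambda Xq: np.array([table.get(key, 0.0) for key in cell_index(Xq @ R)])

    # A few boosting rounds over histogram-transform base learners.
    X = rng.uniform(-1, 1, size=(500, 2))
    y = np.sin(2 * X[:, 0]) * X[:, 1] + 0.1 * rng.normal(size=500)
    F, nu = np.zeros(500), 0.3
    for _ in range(20):
        h = fit_histogram_learner(X, y - F)
        F += nu * h(X)
    print("training MSE:", float(np.mean((y - F) ** 2)))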

Improved Secant Method Applied to Boost Trajectory Optimization

R.G. Gottlieb, W.T. Fowler
1977 Journal of Spacecraft and Rockets  
Approximations of the Jacobian using a least-squares procedure help to speed convergence of the procedure.  ...  In order to obtain starting values for the secant procedure which were within its envelope of convergence, a min-H procedure was employed.  ... 
doi:10.2514/3.57166 fatcat:jjqtvk3vgjekhlkmlooi4apqba
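
For readers unfamiliar with multivariate secant iterations, here is a minimal Broyden-style sketch for a root problem F(x) = 0, in which a rank-one update maintains an approximate Jacobian from successive iterates. The paper's refinement instead builds the Jacobian approximation by a least-squares fit over several past iterates and applies it to boost trajectory targeting; the toy circle-line system below is purely illustrative.

    import numpy as np

    def broyden(F, x0, tol=1e-10, max_iter=50):
        x = np.asarray(x0, dtype=float)
        B = np.eye(len(x))                           # Jacobian approximation
        f = F(x)
        for _ in range(max_iter):
            if np.linalg.norm(f) < tol:
                break
            s = np.linalg.solve(B, -f)               # secant step
            x_new = x + s
            f_new = F(x_new)
            yv = f_new - f
            B += np.outer(yv - B @ s, s) / (s @ s)   # rank-one secant update
            x, f = x_new, f_new
        return x

    # Toy root problem: intersection of the unit circle with the line x = y.
    sol = broyden(lambda v: np.array([v[0] ** 2 + v[1] ** 2 - 1.0, v[0] - v[1]]),
                  x0=[0.8, 0.6])
    print(sol)    # approximately [0.7071, 0.7071]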

Experiments With Repeating Weighted Boosting Search for Optimization in Signal Processing Applications

S. Chen, X.X. Wang, C.J. Harris
2005 IEEE Transactions on Systems Man and Cybernetics Part B (Cybernetics)  
The paper proposes a guided global search optimization technique, referred to as the repeated weighted boosting search.  ...  A heuristic explanation is given for the global search capability of this technique.  ...  The RWBS algorithm described in Section II is used to append kernels one by one in an orthogonal forward selection (OFS) procedure.  ... 
doi:10.1109/tsmcb.2005.845398 pmid:16128453 fatcat:54n6fzvgkndstm4dvxxdtcitju

Spatial pooling for greyscale images

John Thornton, Andrew Srbic
2012 International Journal of Machine Learning and Cybernetics  
Specifically, we evaluate the performance of a recently proposed binary spatial pooling algorithm on a well-known benchmark of greyscale natural images.  ...  In the process, we augment the algorithm to handle greyscale images, and to produce better quality encodings of binary images.  ...  Acknowledgements We thank Ben Willmore and David Tolhurst for supplying the images used in their original paper on sparse neural codes [23] .  ... 
doi:10.1007/s13042-012-0087-7 fatcat:r5w2ysm2rrcvha3xgi75y466gq

LUT-Based Adaboost for Gender Classification [chapter]

Bo Wu, Haizhou Ai, Chang Huang
2003 Lecture Notes in Computer Science  
Because of this limitation of the hypothesis model, the training procedure has difficulty converging.  ...  This algorithm converges quickly and results in efficient classifiers. The experiments and analysis show that the LUT weak classifiers are more suitable for the boosting procedure than threshold-based ones.  ...  This makes LUT weak classifiers very suitable for the boosting procedure. Experimental results show the efficiency of our algorithm.  ... 
doi:10.1007/3-540-44887-x_13 fatcat:v7omrhhu6rhenmdxagqdvxme5i
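
To illustrate why a look-up-table (LUT) weak classifier is more flexible than a single threshold, the sketch below splits a feature's range into fixed bins, lets each bin vote for the weighted-majority class, and then performs one AdaBoost reweighting step. The bin count, toy data, and surrounding loop are illustrative rather than the paper's face/gender setup.

    import numpy as np

    def fit_lut_weak(x, y, w, n_bins=8, lo=0.0, hi=1.0):
        """x: 1-D feature in [lo, hi]; y in {-1, +1}; w: sample weights."""
        edges = np.linspace(lo, hi, n_bins + 1)
        bins = np.clip(np.digitize(x, edges) - 1, 0, n_bins - 1)
        lut = np.ones(n_bins)
        for b in range(n_bins):
            m = bins == b
            if m.any() and np.sum(w[m] * y[m]) < 0:   # weighted majority in this bin is -1
                lut[b] = -1.0
        return lambda xq: lut[np.clip(np.digitize(xq, edges) - 1, 0, n_bins - 1)]

    rng = np.random.default_rng(0)
    x = rng.uniform(size=500)
    y = np.where(np.sin(10 * x) > 0, 1.0, -1.0)       # target with several sign changes
    w = np.full(500, 1 / 500)

    h = fit_lut_weak(x, y, w)                         # one LUT weak hypothesis
    err = np.sum(w * (h(x) != y))
    alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
    w = w * np.exp(-alpha * y * h(x))                 # one AdaBoost reweighting step
    w /= w.sum()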

Behavior of linear L2-boosting algorithms in the vanishing learning rate asymptotic [article]

Clément Dombry
2020 arXiv   pre-print
We investigate the asymptotic behaviour of gradient boosting algorithms when the learning rate converges to zero and the number of iterations is rescaled accordingly.  ...  In addition, the training and test error of the limiting procedure are thoroughly analyzed.  ...  Finally, we discuss the notion of stability of the boosting procedure. It requires that the output of the boosting algorithm does not explode for large time values. Definition 2.9.  ... 
arXiv:2012.14657v1 fatcat:pg5qje6oazdi5jbyvosouk4zgi
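
The rescaling described in this entry can be checked numerically: dividing the learning rate by ten while multiplying the number of iterations by ten should drive the fitted values toward a common limit. The componentwise linear L2-boosting routine below is a standard choice of linear base procedure and is illustrative only; it is not the paper's formal construction.

    import numpy as np

    def linear_l2_boost(X, y, nu, n_iter):
        """Componentwise linear L2-boosting: each round updates the best single covariate."""
        F = np.zeros(len(y))
        for _ in range(n_iter):
            r = y - F
            coefs = X.T @ r / np.sum(X ** 2, axis=0)               # per-covariate LS fit
            scores = [np.sum((r - c * X[:, j]) ** 2) for j, c in enumerate(coefs)]
            j = int(np.argmin(scores))
            F += nu * coefs[j] * X[:, j]
        return F

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = X @ np.array([1.0, -2.0, 0.0, 0.5, 0.0]) + 0.2 * rng.normal(size=200)

    F_coarse = linear_l2_boost(X, y, nu=0.1, n_iter=100)     # nu * n_iter = 10
    F_fine = linear_l2_boost(X, y, nu=0.01, n_iter=1000)     # same "time", smaller steps
    print(np.max(np.abs(F_coarse - F_fine)))                 # the two runs should be close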

Multiple Lyapunov Function Based Reaching Condition for Orbital Existence of Switching Power Converters

S.K. Mazumder, K. Acharya
2008 IEEE transactions on power electronics  
of convergence of an SPC.  ...  Further, the criterion is modified to distinguish the different modes (i.e., sliding and asymptotic modes and combinations of the two fundamental modes) of convergence of the reaching dynamics.  ...  ACKNOWLEDGMENT The authors would like to thank Texas Instruments and Altera for their support in building the experimental setup for this work.  ... 
doi:10.1109/tpel.2008.921065 fatcat:xwitsmwjgvgj3ieko4tajayqiu

On Boosting Improvement: Error Reduction and Convergence Speed-Up [chapter]

Marc Sebban, Henri-Maxime Suchier
2003 Lecture Notes in Computer Science  
The convergence speed of boosting is also penalized on such databases, where there is a large overlap between the probability density functions of the classes to learn (large Bayesian error).  ...  Boosting is not only the most efficient ensemble learning method in practice, but also the one based on the most robust theoretical properties.  ...  Convergence Speed: While the negative impacts of noisy data on boosting performance have been frequently mentioned in the literature, the causes of a slowdown in convergence have rarely been studied.  ... 
doi:10.1007/978-3-540-39857-8_32 fatcat:ox4jq26gkjcydhxjc7ydy2ixy4

Page 96 of Journal of Spacecraft and Rockets Vol. 14, Issue 2 [page]

1977 Journal of Spacecraft and Rockets  
Approximations of the Jacobian using a least-squares procedure help to speed convergence of the procedure.  ...  The secant procedure basically calls for one to guess the unknown λ's at t₀ and then to integrate a trajectory based on these guesses.  ... 

Augmented Spatial Pooling [chapter]

John Thornton, Andrew Srbic, Linda Main, Mahsa Chitsaz
2011 Lecture Notes in Computer Science  
Specifically, we evaluate the performance of a recently proposed binary spatial pooling algorithm on a well-known benchmark of greyscale natural images.  ...  Our main contribution is to augment the algorithm to handle greyscale images, and to produce better quality encodings of binary images.  ...  Acknowledgments: We thank David Tolhurst and Ben Willmore for supplying the images used in their original paper.  ... 
doi:10.1007/978-3-642-25832-9_27 fatcat:l6zos4qhwvdilnmey5dpjkmdyi
Showing results 1 — 15 out of 118,674 results