16,515 Hits in 1.5 sec

Agnostic Boosting [chapter]

Shai Ben-David, Philip M. Long, Yishay Mansour
2001 Lecture Notes in Computer Science  
We show a boosting algorithm that, using the weak agnostic learner, computes a hypothesis whose error is at most max{c_1(β) · er(F)^{c_2(β)}, ε}, in time polynomial in 1/ε.  ...  We extend the boosting paradigm to the realistic setting of agnostic learning, that is, to a setting where the training sample is generated by an arbitrary (unknown) probability distribution over examples  ...  Agnostic Boosting: In this section we prove our main theorem about boosting using a β-weak agnostic learner.  ... 
doi:10.1007/3-540-44581-1_33 fatcat:wc7x7pes6fcjlaldhpwajdtsae
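
The weak-to-strong template shared by the results on this page can be made concrete. Below is a minimal, illustrative sketch of a distribution-reweighting boosting loop; weak_stump is our hypothetical stand-in for the β-weak agnostic learner the papers assume as an oracle, and the exponential reweighting shown is AdaBoost's, which agnostic boosters replace with gentler updates.

    import numpy as np

    def weak_stump(X, y, w):
        # Stand-in weak learner: the single-feature threshold stump with the
        # lowest weighted error on (X, y, w); labels y are in {-1, +1}.
        best = None
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.where(X[:, j] <= t, 1, -1)
                    err = np.sum(w[pred != y])
                    if best is None or err < best[0]:
                        best = (err, j, t, s)
        err, j, t, s = best
        return err, lambda Z: s * np.where(Z[:, j] <= t, 1, -1)

    def boost(X, y, rounds=20):
        # Reweight-and-combine loop: emphasize misclassified points, then
        # output a weighted-majority vote over the weak hypotheses.
        n = len(y)
        w = np.full(n, 1.0 / n)
        hyps, alphas = [], []
        for _ in range(rounds):
            err, h = weak_stump(X, y, w)
            err = max(err, 1e-10)
            if err >= 0.5:          # no weak advantage left
                break
            alpha = 0.5 * np.log((1 - err) / err)
            w *= np.exp(-alpha * y * h(X))   # AdaBoost-style update
            w /= w.sum()
            hyps.append(h); alphas.append(alpha)
        return lambda Z: np.sign(sum(a * h(Z) for a, h in zip(alphas, hyps)))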

Distribution-Specific Agnostic Boosting [article]

Vitaly Feldman
2009 arXiv   pre-print
This allows boosting a distribution-specific weak agnostic learner to a strong agnostic learner with respect to the same distribution.  ...  We consider the problem of boosting the accuracy of weak learning algorithms in the agnostic learning framework of Haussler (1992) and Kearns et al. (1992).  ...  Agnostic Boosting The main component of the agnostic boosting algorithm in the work of Kalai et al.  ... 
arXiv:0909.2927v1 fatcat:4ckz5ryasngmzmu4bxsbgietam

Communication Efficient Distributed Agnostic Boosting [article]

Shang-Tse Chen, Maria-Florina Balcan, Duen Horng Chau
2016 arXiv   pre-print
We consider the problem of learning from distributed data in the agnostic setting, i.e., in the presence of arbitrary forms of noise.  ...  Our main contribution is a general distributed boosting-based procedure for learning an arbitrary concept space that is simultaneously noise tolerant, communication efficient, and computationally efficient  ...  The first agnostic boosting algorithm was proposed in [5].  ... 
arXiv:1506.06318v2 fatcat:5s7iurvrhzbgjeluu3ioimwgj4
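
Why boosting helps in the distributed setting: the coordinator never ships raw datasets around, only hypotheses and small weighted samples, so per-round communication is independent of the total data size. The sketch below is our own simplification of such a protocol, not the paper's procedure; the doubling reweighting in particular is a crude placeholder.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def distributed_boost(sites, rounds=10, per_site=100, seed=0):
        # sites: list of (X_i, y_i) pairs held by k parties; labels in {-1, +1}.
        rng = np.random.default_rng(seed)
        weights = [np.full(len(y), 1.0 / len(y)) for _, y in sites]
        ensemble = []
        for _ in range(rounds):
            # Communication step: each site ships only a small weighted resample,
            # so per-round cost is O(k * per_site) examples plus one hypothesis.
            Xs, ys = [], []
            for (X, y), w in zip(sites, weights):
                idx = rng.choice(len(y), size=min(per_site, len(y)), p=w / w.sum())
                Xs.append(X[idx]); ys.append(y[idx])
            h = DecisionTreeClassifier(max_depth=1).fit(np.vstack(Xs),
                                                        np.concatenate(ys))
            ensemble.append(h)
            # Broadcast h; each site reweights its own data locally.
            for k, ((X, y), w) in enumerate(zip(sites, weights)):
                wrong = h.predict(X) != y
                weights[k] = np.where(wrong, 2.0 * w, w)
        return ensemble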

On agnostic boosting and parity learning

Adam Tauman Kalai, Yishay Mansour, Elad Verbin
2008 Proceedings of the fortieth annual ACM symposium on Theory of Computing - STOC '08  
Our agnostic boosting framework is completely general and may be applied to other agnostic learning problems.  ...  Hence, it also sheds light on the actual difficulty of agnostic learning by showing that full agnostic boosting is indeed possible.  ...  We are also grateful to Rocco Servedio for allowing us to reproduce parts of the analysis of the earlier boosting algorithm.  ... 
doi:10.1145/1374376.1374466 dblp:conf/stoc/KalaiMV08 fatcat:kcgl7xxjmfa3tdwae6abzr4kgu

Online Agnostic Boosting via Regret Minimization [article]

Nataly Brukhim, Xinyi Chen, Elad Hazan, Shay Moran
2020 arXiv   pre-print
In this work we provide the first agnostic online boosting algorithm; that is, given a weak learner with only marginally-better-than-trivial regret guarantees, our algorithm boosts it to a strong learner  ...  While in statistical learning numerous boosting methods exist both in the realizable and agnostic settings, in online learning they exist only in the realizable case.  ...  Why Online Agnostic Boosting?  ... 
arXiv:2003.01150v1 fatcat:7rfda3kyabgovdlysijnuccrbe
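
The regret-minimization view can be sketched as follows: maintain several copies of the weak online learner and combine their predictions with a multiplicative-weights (Hedge) aggregator, whose regret guarantee supplies the boosting. This toy illustration is not the paper's algorithm; it assumes a hypothetical weak-learner interface with predict(x) in {-1, +1} and update(x, y).

    import numpy as np

    class HedgeBooster:
        # Toy online aggregator: N weak online learners combined by Hedge,
        # so the aggregate has low regret to the best weak learner -- the
        # ingredient that agnostic online boosting strengthens into regret
        # against the whole comparator class.
        def __init__(self, weak_learners, eta=0.1):
            self.learners = weak_learners
            self.w = np.ones(len(weak_learners))
            self.eta = eta

        def predict(self, x):
            votes = np.array([wl.predict(x) for wl in self.learners])
            return int(np.sign(np.dot(self.w, votes)) or 1)

        def update(self, x, y):
            for i, wl in enumerate(self.learners):
                loss = float(wl.predict(x) != y)        # 0/1 loss per learner
                self.w[i] *= np.exp(-self.eta * loss)   # Hedge update
                wl.update(x, y)                         # weak learner adapts too
            self.w /= self.w.sum()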

Boosting mono-jet searches with model-agnostic machine learning [article]

Thorben Finke, Michael Krämer, Maximilian Lipp, Alexander Mück
2022 arXiv   pre-print
For the example of a strongly interacting dark matter model, we employ simulated data to show that the discovery potential of an existing generic search can be boosted considerably.  ...  In such a setup, model-agnostic ML is particularly promising since it can go far beyond counting events.  ...  Model-agnostic Machine Learning (ML) techniques have been shown to provide sensitivity to various new physics signatures.  ... 
arXiv:2204.11889v2 fatcat:idgqdt4yy5bmtipyyikdogfnk4

Optimally-Smooth Adaptive Boosting and Application to Agnostic Learning [chapter]

Dmitry Gavinsky
2002 Lecture Notes in Computer Science  
The boosting approach was originally suggested for the standard PAC model; we analyze possible applications of boosting in the context of agnostic learning, which is more realistic than the PAC model.  ...  We derive a lower bound for the final error achievable by boosting in the agnostic model and show that our algorithm actually achieves that accuracy (within a constant factor).  ...  Agnostic Boosting: In this section we apply AdaFlat to agnostic boosting.  ... 
doi:10.1007/3-540-36169-3_10 fatcat:nu6xunw5jfhfxam3d32k36auka
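
Smoothness here means the booster never concentrates too much weight on any single example, which is what buys noise tolerance in the agnostic setting. A minimal illustration of one capped reweighting step (our own simplification; AdaFlat's actual update differs):

    import numpy as np

    def smooth_reweight(w, margins, eta=0.5, cap_factor=4.0):
        # Multiplicative update that emphasizes poorly-handled examples, then
        # a cap-and-renormalize step (a one-shot approximation of the
        # projection real smooth boosters use). Bounded weights are what
        # distinguish agnostic-friendly boosters from AdaBoost, whose weights
        # can grow exponentially on noisy points.
        n = len(w)
        w = w * np.exp(-eta * margins)
        w = np.minimum(w / w.sum(), cap_factor / n)
        return w / w.sum()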

Improper Learning by Refuting

Pravesh K. Kothari, Roi Livni, Marc Herbstritt
2018 Innovations in Theoretical Computer Science  
This has little bearing, however, on the sample complexity of efficient agnostic learning.  ...  Our work can be seen as making the relationship between agnostic learning and refutation implicit in their work into an explicit equivalence.  ...  In the second step, we use the distribution-specific agnostic boosting algorithm (see [17]) to boost the accuracy of the weak learner to obtain an agnostic learner.  ... 
doi:10.4230/lipics.itcs.2018.55 dblp:conf/innovations/KothariL18 fatcat:nikdomcdarhwpezrw7aetrtr24

Agnostic Learning by Refuting [article]

Pravesh K. Kothari, Roi Livni
2017 arXiv   pre-print
This has little bearing, however, on the sample complexity of efficient agnostic learning.  ...  Our work can be seen as making the relationship between agnostic learning and refutation implicit in their work into an explicit equivalence.  ...  In the second step, we use the distribution-specific agnostic boosting algorithm (see [KK09]) to boost the accuracy of the weak learner to obtain an agnostic learner.  ... 
arXiv:1709.03871v2 fatcat:65dtn7ses5ge5bzriufwfzhsqe

Communication-Aware Collaborative Learning [article]

Avrim Blum, Shelby Heinecke, Lev Reyzin
2020 arXiv   pre-print
We develop communication efficient collaborative PAC learning algorithms using distributed boosting.  ...  Suppose Distributed Agnostic Boosting has access to a β-weak agnostic learner. Let β be a fixed constant.  ...  We now recall the communication complexity of Distributed Agnostic Boosting. Theorem 20 (Chen, Balcan, and Chau 2016). Suppose Distributed Agnostic Boosting has access to a β-weak agnostic learner.  ... 
arXiv:2012.10569v1 fatcat:ps5lh4yw25aajebz5s73iw4hpe

Tree-based local explanations of machine learning model predictions, AraucanaXAI [article]

Enea Parimbelli, Giovanna Nicora, Szymon Wilk, Wojtek Michalowski, Riccardo Bellazzi
2021 arXiv   pre-print
Increasingly complex learning methods such as boosting, bagging and deep learning have made ML models more accurate, but harder to understand and interpret.  ...  Discussion Our AraucanaXAI approach has a number of advantages over comparable approaches for local, model-agnostic, post-hoc explanations.  ...  Conclusion Local, model-agnostic post-hoc explanations constitute a valuable effort towards tackling the performance vs interpretability tradeoff.  ... 
arXiv:2110.08272v1 fatcat:njki4h3qxvf4hbo3olfj54snmi
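
The generic recipe behind local, model-agnostic, post-hoc explanations of this kind: perturb the neighborhood of the instance, label the perturbations with the black-box model, and fit an interpretable surrogate. A bare-bones version with a decision tree (a simplification; AraucanaXAI's neighborhood construction is more involved):

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    def local_tree_explanation(black_box_predict, x, X_background,
                               n_samples=500, scale=0.3, max_depth=3, seed=0):
        # Sample a synthetic neighborhood around instance x by jittering it
        # with noise scaled to each feature's spread in the background data.
        rng = np.random.default_rng(seed)
        std = X_background.std(axis=0)
        Z = x + rng.normal(0.0, scale, size=(n_samples, x.shape[0])) * std
        yZ = black_box_predict(Z)              # black box labels the neighborhood
        tree = DecisionTreeClassifier(max_depth=max_depth).fit(Z, yZ)
        return export_text(tree)               # human-readable local rules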

A Robust Boosting Algorithm [chapter]

Richard Nock, Patrice Lefaucheur
2002 Lecture Notes in Computer Science  
This last property is ruled out for voting-based Boosting algorithms like AdaBoost.  ...  Among its properties of practical relevance, the algorithm has significant resistance against noise, and is efficient even in an agnostic learning setting.  ...  Therefore, for each possible t, SFboost agnostically learns the target concept. To our knowledge, SFboost is the first Boosting algorithm which is also an agnostic/robust learning algorithm.  ... 
doi:10.1007/3-540-36755-1_27 fatcat:nyjzajdtjjch7ccgc7cedel5jm

Omnipredictors [article]

Parikshit Gopalan, Adam Tauman Kalai, Omer Reingold, Vatsal Sharan, Udi Wieder
2021 arXiv   pre-print
In addition, we show how multicalibration can be viewed as a solution concept for agnostic boosting, shedding new light on past results.  ...  Agnostic boosting from multicalibration.  ...  The boosting approach to agnostic learning is to start from a weak agnostic learner, which only guarantees some non-trivial correlation with the labels, and boost it to obtain a classifier that agnostically  ... 
arXiv:2109.05389v1 fatcat:4chefbjh5vfl5db64xyht6ut5e
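
The multicalibration-as-boosting connection can be caricatured as a residual-fitting loop: repeatedly look for a function in the class that still correlates with the residual y - p(x) and fold it into the predictor. A toy sketch under that reading, not the paper's construction; find_correlated is a hypothetical weak-agnostic-learner oracle.

    import numpy as np

    def boost_to_uncorrelated(X, y, find_correlated, eta=0.1, tol=0.01,
                              max_iter=200):
        # y in [0, 1]; p is updated until no function returned by
        # find_correlated(X, residual) correlates with the residual above tol,
        # a multicalibration-style stopping condition.
        p = np.full(len(y), y.mean())
        for _ in range(max_iter):
            residual = y - p
            c = find_correlated(X, residual)   # hypothetical weak-learner call
            if c is None:
                break
            corr = np.mean(c(X) * residual)
            if abs(corr) <= tol:
                break
            p = np.clip(p + eta * np.sign(corr) * c(X), 0.0, 1.0)
        return p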

Model Agnostic Combination for Ensemble Learning [article]

Ohad Silbert, Yitzhak Peleg, Evi Kopelowitz
2020 arXiv   pre-print
Being agnostic to the number of sub-models enables addition and replacement of sub-models in the combination even after deployment, unlike many of the current methods for ensembling such as stacking, boosting  ...  We show that on the Kaggle RSNA Intracranial Hemorrhage Detection challenge, MAC outperforms classical average methods, demonstrates competitive results to boosting via XGBoost for a fixed number of sub-models  ...  Unlike conventional ensembling techniques, like boosting and stacking, in this framework, coined MAC (Model Agnostic Combination), the sub-models are not weighted in the combination with respect to the overall  ... 
arXiv:2006.09025v1 fatcat:kdrtx5e6evhv5jpmawtxnwp7wi
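
Being agnostic to the number of sub-models suggests a permutation-invariant combiner: embed each sub-model's output with shared parameters, pool symmetrically, then decode. The following generic set-pooling combiner is our own construction for illustration, not MAC itself.

    import numpy as np

    def set_pool_combine(sub_model_probs, embed, decode):
        # sub_model_probs: list of per-model probability vectors (any length).
        # A shared embed() followed by mean pooling keeps the combiner
        # independent of how many sub-models are present, so models can be
        # added or replaced after deployment.
        E = np.stack([embed(p) for p in sub_model_probs])
        return decode(E.mean(axis=0))

    # Trivial instantiation: identity embedding plus argmax decoding reduces
    # this to classical probability averaging, the baseline MAC is compared to.
    probs = [np.array([0.2, 0.8]), np.array([0.4, 0.6]), np.array([0.1, 0.9])]
    print(set_pool_combine(probs, embed=lambda p: p, decode=np.argmax))  # -> 1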

Quantifying Model Complexity via Functional Decomposition for Better Post-Hoc Interpretability [article]

Christoph Molnar, Giuseppe Casalicchio, Bernd Bischl
2019 arXiv   pre-print
Post-hoc model-agnostic interpretation methods such as partial dependence plots can be employed to interpret complex machine learning models.  ...  To quantify the complexity of arbitrary machine learning models, we propose model-agnostic complexity measures based on functional decomposition: number of features used, interaction strength and main  ...  The interaction strength (IAS) is zero for additive models (boosted GAM, (regularized) linear models).  ... 
arXiv:1904.03867v2 fatcat:2zqqri46hrbozbqszojvga46xe
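
One of these measures, the number of features used (NF), can be probed model-agnostically: a feature counts as used if permuting it changes the model's output anywhere. A simple version of that check (an assumed simplification of the paper's procedure; predict must return numeric outputs such as probabilities or regression scores):

    import numpy as np

    def n_features_used(predict, X, tol=1e-8, seed=0):
        # Permute one column at a time; if predictions move by more than tol
        # on any row, the model genuinely depends on that feature.
        rng = np.random.default_rng(seed)
        base = predict(X)
        used = 0
        for j in range(X.shape[1]):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])
            if np.max(np.abs(predict(Xp) - base)) > tol:
                used += 1
        return used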
Showing results 1 — 15 out of 16,515 results