1,793 Hits in 6.2 sec

Hardness Results for Agnostically Learning Low-Degree Polynomial Threshold Functions [article]

Ilias Diakonikolas and Ryan O'Donnell and Rocco A. Servedio and Yi Wu
2010 arXiv   pre-print
In this paper we prove two hardness results for the problem of finding a low degree polynomial threshold function (PTF) which has the maximum possible agreement with a given set of labeled examples in  ...  Hardness results for maximum agreement problems have close connections to hardness results for proper learning in computational learning theory.  ...  Conclusion We have established two hardness results for proper agnostic learning of low-degree PTFs.  ... 
arXiv:1010.3484v1 fatcat:cobcg7hjbvfzxjggj6cix5z6ny

Hardness Results for Agnostically Learning Low-Degree Polynomial Threshold Functions [chapter]

Ilias Diakonikolas, Ryan O'Donnell, Rocco A. Servedio, Yi Wu
2011 Proceedings of the Twenty-Second Annual ACM-SIAM Symposium on Discrete Algorithms  
In this paper we prove two hardness results for the problem of finding a low degree polynomial threshold function (PTF) which has the maximum possible agreement with a given set of labeled examples in  ...  Hardness results for maximum agreement problems have close connections to hardness results for proper learning in computational learning theory.  ...  Conclusion We have established two hardness results for proper agnostic learning of low-degree PTFs.  ... 
doi:10.1137/1.9781611973082.123 dblp:conf/soda/DiakonikolasOSW11 fatcat:di3xwzhtmjfbhhtsxfly63zkae

Hardness Results for Agnostically Learning Low-Degree Polynomial Threshold Functions

Ilias Diakonikolas, Ryan O'Donnell, Rocco A. Servedio, Yi Wu
2018
In this paper we prove two hardness results for the problem of finding a low degree polynomial threshold function (PTF) which has the maximum possible agreement with a given set of labeled examples in  ...  Hardness results for maximum agreement problems have close connections to hardness results for proper learning in computational learning theory.  ...  Conclusion We have established two hardness results for proper agnostic learning of low-degree PTFs.  ... 
doi:10.1184/r1/6606029 fatcat:vcj3shfhrzdsxgplnd5lv6yxfu

The Polynomial Method is Universal for Distribution-Free Correlational SQ Learning [article]

Aravind Gollakota, Sushrut Karmalkar, Adam Klivans
2020 arXiv   pre-print
These match corresponding positive results using upper bounds on the threshold or approximate degree in the SQ model for PAC or agnostic learning.  ...  or approximate degree of any function class directly imply CSQ lower bounds for PAC or agnostic learning respectively.  ...  as polynomial threshold functions (see [HS07] for a survey) or, for agnostic learning, pointwise approximation [KKMS08] .  ... 
arXiv:2010.11925v2 fatcat:qcxe4nk6ofh3zmvwpq6mqkfbkq

Weighted Polynomial Approximations: Limits for Learning and Pseudorandomness [article]

Mark Bun, Thomas Steinke
2014 arXiv   pre-print
In particular, polynomial approximations to the sign function underlie algorithms for agnostically learning halfspaces, as well as pseudorandom generators for halfspaces.  ...  We show that polynomials of any degree cannot approximate the sign function to within arbitrarily low error for a large class of non-log-concave distributions on the real line, including those with densities  ...  Acknowledgements We thank Varun Kanade, Scott Linderman, Raghu Meka, Jelani Nelson, Justin Thaler, Salil Vadhan, Les Valiant, and several anonymous reviewers for helpful discussions and comments.  ... 
arXiv:1412.2457v1 fatcat:xtydl6gpobhbjprfa3lglv25bq

Near-Optimal Statistical Query Lower Bounds for Agnostically Learning Intersections of Halfspaces with Gaussian Marginals [article]

Daniel Hsu, Clayton Sanford, Rocco Servedio, Emmanouil-Vasileios Vlatakis-Gkaragkounis
2022 arXiv   pre-print
Recent work of Diakonikolas et al. (2021) shows that any Statistical Query (SQ) algorithm for agnostically learning the class of intersections of k halfspaces over ℝ^n to constant excess error either must  ...  We consider the well-studied problem of learning intersections of halfspaces under the Gaussian distribution in the challenging agnostic learning model.  ...  bound for other monotone Boolean functions by combining a hardness result on weak learning of Blum et al. (1998a) with an agnostic learning algorithm based on L1 polynomial approximation by Kalai  ... 
arXiv:2202.05096v1 fatcat:jpmmq7m6qnacvj3gf7zqyafhpi

Agnostic Learning of Disjunctions on Symmetric Distributions [article]

Vitaly Feldman, Pravesh Kothari
2015 arXiv   pre-print
Therefore the learning result above cannot be achieved via ℓ_1-regression with a polynomial basis used in most other agnostic learning algorithms.  ...  This directly gives an agnostic learning algorithm for disjunctions on symmetric distributions that runs in time n^O(log(1/ϵ)).  ...  While the problem appears to be hard, strong hardness results are known only if the hypothesis is restricted to be a disjunction or a linear threshold function [Ben-David et al., 2003, Bshouty and Burroughs  ... 
arXiv:1405.6791v2 fatcat:fwygrqmhyremxpwedlpz5hqgmq

Polynomial regression under arbitrary product distributions

Eric Blais, Ryan O'Donnell, Karl Wimmer
2010 Machine Learning  
They showed that the L1 polynomial regression algorithm yields agnostic (tolerant to arbitrary noise) learning algorithms with respect to the class of threshold functions, under certain restricted instance  ...  In this work we show how all learning results based on the Low-Degree Algorithm can be generalized to give almost identical agnostic guarantees under arbitrary product distributions on instance spaces  ...  Besides linear threshold functions, the other main example of concentration comes from the original application of the Low-Degree Algorithm (Linial et al. 1993): learning AC0 functions in quasi-polynomial  ... 
doi:10.1007/s10994-010-5179-6 fatcat:3y7pokynffdv7ftr4gkvcovjly

Moment-Matching Polynomials [article]

Adam Klivans, Raghu Meka
2013 arXiv   pre-print
We give a new framework for proving the existence of low-degree, polynomial approximators for Boolean functions with respect to broad classes of non-product distributions.  ...  Our main application is the first polynomial-time algorithm for agnostically learning any function of a constant number of halfspaces with respect to any log-concave distribution (for any constant accuracy  ...  To prove Theorem 5.4 we use the following results about low-degree polynomials. The lemma gives us control on how fast the moments of low-degree polynomials grow.  ... 
arXiv:1301.0820v1 fatcat:z7zi4eddwnecvaynrb5yi2lxiy

On Robust Concepts and Small Neural Nets

Amit Deshpande, Sushrut Karmalkar
2017 International Conference on Learning Representations  
We also give a polynomial time learning algorithm that outputs a small two-layer linear threshold circuit that approximates such a given function.  ...  We also show weaker generalizations of this to noise-stable polynomial threshold functions and noise-stable boolean functions in general.  ...  This is arguably more natural than other improper learning results for halfspaces via low-degree polynomials.  ... 
dblp:conf/iclr/0001K17 fatcat:todir5xr2fdz5ecxm5ssw4vyja

Reliably Learning the ReLU in Polynomial Time [article]

Surbhi Goel, Varun Kanade, Adam Klivans, Justin Thaler
2016 arXiv   pre-print
These results are in contrast to known efficient algorithms for reliably learning linear threshold functions, where ϵ must be Ω(1) and strong assumptions are required on the marginal distribution.  ...  of low-weight polynomials on the unit sphere.  ...  The authors are grateful to Sanjeev Arora and Roi Livni for helpful feedback and useful discussions on this work.  ... 
arXiv:1611.10258v1 fatcat:i73nvsnx7begzambyk7k3347om

Embedding Hard Learning Problems Into Gaussian Space

Adam Klivans, Pravesh Kothari, Marc Herbstritt
2014 International Workshop on Approximation Algorithms for Combinatorial Optimization  
We give the first representation-independent hardness result for agnostically learning halfspaces with respect to the Gaussian distribution.  ...  We also show that the problem of agnostically learning sparse polynomials with respect to the Gaussian distribution in polynomial time is as hard as PAC learning DNFs on the uniform distribution in polynomial  ...  We thank Chengang Wu for numerous discussions during the preliminary stages of this work. We thank the anonymous reviewers for pointing out the typos in a previous version of this paper.  ... 
doi:10.4230/lipics.approx-random.2014.793 dblp:conf/approx/KlivansK14 fatcat:uogqzy45fbfjjfueqtqmd3dwc4

Hardness of Agnostically Learning Halfspaces from Worst-Case Lattice Problems [article]

Stefan Tiegel
2022 arXiv   pre-print
We show hardness of improperly learning halfspaces in the agnostic model based on worst-case lattice problems, e.g., approximating shortest vectors within polynomial factors.  ...  Our work gives the first hardness for this problem based on a worst-case complexity assumption.  ...  We will show hardness by showing that a certain low-degree polynomial threshold function is hard to learn.  ... 
arXiv:2207.14030v1 fatcat:5nfuqvt5bbcm5f7bsearuxkhey

The Optimality of Polynomial Regression for Agnostic Learning under Gaussian Marginals [article]

Ilias Diakonikolas, Daniel M. Kane, Thanasis Pittas, Nikos Zarifis
2021 arXiv   pre-print
Using this characterization along with additional analytic tools, we obtain optimal SQ lower bounds for agnostically learning linear threshold functions and the first non-trivial SQ lower bounds for polynomial  ...  related to the polynomial degree required to approximate any function from the class in L1-norm.  ...  Using analytic techniques, we establish explicit lower bounds on the L1 polynomial approximation degree for three fundamental concept classes: Linear Threshold Functions (LTFs), Polynomial Threshold Functions  ... 
arXiv:2102.04401v1 fatcat:7xyqlzilezadjbu3cjmd4et6zy

Noise in Classification [article]

Maria-Florina Balcan, Nika Haghtalab
2020 arXiv   pre-print
We discuss approaches for dealing with these negative results by exploiting natural assumptions on the data-generating process.  ...  This chapter considers the computational and statistical aspects of learning linear thresholds in presence of noise.  ...  Importantly, this result establishes that a low-degree polynomial threshold approximates sign(·) in expectation.  ... 
arXiv:2010.05080v2 fatcat:pastdiskivby5ak5yl74t3acti
Showing results 1 — 15 out of 1,793 results