28,828 Hits in 2.9 sec

Polynomial regression under arbitrary product distributions

Eric Blais, Ryan O'Donnell, Karl Wimmer
2010 Machine Learning  
They showed that the L1 polynomial regression algorithm yields agnostic (tolerant to arbitrary noise) learning algorithms with respect to the class of threshold functions — under certain restricted instance  ...  We also extend these results to learning under mixtures of product distributions.  ...  As a consequence, polynomial regression agnostically learns with respect to C under arbitrary product distributions in time n^((O(log(s/ε)))^(c−1)/ε²).  ...
doi:10.1007/s10994-010-5179-6 fatcat:3y7pokynffdv7ftr4gkvcovjly
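Not the authors' code, but a minimal sketch of the L1 polynomial regression idea described in the entry above: fit a low-degree polynomial minimizing empirical L1 error (a linear program), then predict with its sign. The univariate setting, degree, noise rate, and all names are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def l1_poly_regress(x, y, degree):
    """Fit a polynomial minimizing sum_i |p(x_i) - y_i| via a linear program."""
    n, k = len(x), degree + 1
    V = np.vander(x, k)                          # monomial design matrix
    # Variables: [k free coefficients c, n slacks t >= 0]; minimize sum(t)
    # subject to  V c - y <= t  and  y - V c <= t,  i.e. |V c - y| <= t.
    cost = np.concatenate([np.zeros(k), np.ones(n)])
    A_ub = np.block([[V, -np.eye(n)], [-V, -np.eye(n)]])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * k + [(0, None)] * n
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:k]                             # coefficients, highest degree first

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = np.sign(x - 0.2)
y[rng.random(200) < 0.1] *= -1                   # 10% label noise (agnostic setting)
c = l1_poly_regress(x, y, degree=5)
pred = np.sign(np.polyval(c, x))                 # threshold the fitted polynomial
print("empirical error:", np.mean(pred != y))
```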

Polynomial regression under arbitrary product distributions

Eric Blais, Ryan O'Donnell, Karl Wimmer
2018
They showed that the L1 polynomial regression algorithm yields agnostic (tolerant to arbitrary noise) learning algorithms with respect to the class of threshold functions — under certain restricted instance  ...  We also extend these results to learning under mixtures of product distributions.  ...  This leads to an agnostic learning result for C under arbitrary product distributions which is the same as one would get for C under the uniform distribution on {0, 1}^n, except for an extra factor  ...
doi:10.1184/r1/6608468 fatcat:ivbexowrhbcb3dd4y7b6gj4mje

Weighted Polynomial Approximations: Limits for Learning and Pseudorandomness [article]

Mark Bun, Thomas Steinke
2014 arXiv   pre-print
Firstly, the polynomial regression algorithm of Kalai et al. (SIAM J.  ...  The power of this algorithm relies on the fact that under log-concave distributions, halfspaces can be approximated arbitrarily well by low-degree polynomials.  ...  [DFT+14] shows that, at least for product distributions on the hypercube, polynomials yield the best basis for L1 regression.  ...
arXiv:1412.2457v1 fatcat:xtydl6gpobhbjprfa3lglv25bq

Agnostically Learning Halfspaces

Adam Tauman Kalai, Adam R. Klivans, Yishay Mansour, Rocco A. Servedio
2008 SIAM journal on computing (Print)  
The new algorithm, essentially L1 polynomial regression, is a noise-tolerant arbitrary-distribution generalization of the "low-degree" Fourier algorithm of Linial, Mansour, & Nisan.  ...  {−1, 1}^n or the unit sphere in R^n, as well as under any log-concave distribution over R^n.  ...  et al. can be viewed as an algorithm for performing L2 polynomial regression under the uniform distribution on {−1, 1}^n.  ...
doi:10.1137/060649057 fatcat:ipl2otjwwfasvcmisxhafi65ny
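The "low-degree" Fourier algorithm named in the snippet is easy to sketch. Assuming uniform samples over {−1, 1}^n and purely illustrative parameters, the code below estimates every Fourier coefficient of degree at most d by empirical averaging and predicts with the sign of the resulting polynomial, i.e., L2 polynomial regression under the uniform distribution:

```python
import numpy as np
from itertools import combinations

def low_degree_learn(X, y, d):
    """Estimate hat{f}(S) = E[f(x) * prod_{i in S} x_i] for all |S| <= d
    from samples X in {-1,1}^(m x n) with labels y in {-1,1}^m."""
    n = X.shape[1]
    subsets = [S for k in range(d + 1) for S in combinations(range(n), k)]
    return {S: np.mean(y * np.prod(X[:, list(S)], axis=1)) for S in subsets}

def predict(coeffs, X):
    """Sign of the estimated low-degree polynomial."""
    p = sum(c * np.prod(X[:, list(S)], axis=1) for S, c in coeffs.items())
    return np.sign(p)

rng = np.random.default_rng(1)
m, n = 2000, 7
X = rng.choice([-1, 1], size=(m, n))
y = np.sign(X.sum(axis=1))                       # majority on odd n: no ties
coeffs = low_degree_learn(X, y, d=3)
print("training error:", np.mean(predict(coeffs, X) != y))
```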

Page 1169 of Genetics Vol. 176, Issue 2 [page]

2007 Genetics  
normal distribution.  ...  The curves are definitely not logistic, which explains why animal breeders do not use logistic regression to fit milk production curves.  ... 

Special functions and characterizations of probability distributions by zero regression properties

Barbara Heller
1983 Journal of Multivariate Analysis  
Characterizations of the binomial, negative binomial, gamma, Poisson, and normal distributions are obtained by the property of zero regression of certain polynomial statistics of arbitrary degree  ...  In each case, the equations which express zero regression are derived from the recurrence relations of a set of special functions.  ...  Therefore, by starting with those recurrence relations, we can find polynomial statistics of arbitrary degree which have zero regression on L when the underlying probability distribution is binomial.  ...
doi:10.1016/0047-259x(83)90022-2 fatcat:vzjfwf3ffjesfcanozey5n5c4q
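As a toy check of the zero-regression property this entry characterizes (in the simplest degree-1 case, not the arbitrary-degree statistics of the paper): for i.i.d. normal observations, the statistic x1 − x2 has zero regression on the linear statistic L = x1 + x2. A Monte Carlo sketch with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(2)
x1, x2 = rng.normal(1.0, 2.0, size=(2, 100_000))  # i.i.d. N(1, 4) pairs
L = x1 + x2                                       # linear statistic
Q = x1 - x2                                       # polynomial (degree-1) statistic
# Estimate E[Q | L] by binning on L; zero regression means every bin mean is ~0.
edges = np.quantile(L, np.linspace(0, 1, 11))
idx = np.digitize(L, edges[1:-1])
print(np.round([Q[idx == b].mean() for b in range(10)], 3))
```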

Page 8096 of Mathematical Reviews Vol. , Issue 2004j [page]

2004 Mathematical Reviews  
Asymptotics for polynomial spline regression under weak conditions.  ...  By restricting attention to spaces built by polynomial splines and their tensor products the author can now verify condition (iv) under weaker conditions on the growth rate of the dimension.  ...
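A minimal sketch of the kind of polynomial spline regression this review concerns, assuming a truncated-power basis with arbitrarily placed knots (the reviewed asymptotics, where the spline space grows with the sample, are not reproduced here):

```python
import numpy as np

def truncated_power_basis(x, knots, degree=3):
    """Design matrix for a degree-`degree` spline with the given interior knots."""
    cols = [x ** p for p in range(degree + 1)]                 # global polynomial part
    cols += [np.maximum(x - k, 0.0) ** degree for k in knots]  # one column per knot
    return np.column_stack(cols)

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 1, 300))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.size)
B = truncated_power_basis(x, knots=np.linspace(0.1, 0.9, 8))
beta, *_ = np.linalg.lstsq(B, y, rcond=None)                   # least-squares spline fit
rmse = np.sqrt(np.mean((B @ beta - np.sin(2 * np.pi * x)) ** 2))
print("RMSE against the true curve:", rmse)
```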

Partialed products are interactions; partialed powers are curve components

Jacob Cohen
1978 Psychological bulletin  
score regression coefficients are simply rescaled.  ...  and anxiety about their use as independent variables in the representation of interactions and curve components in general multiple regression/correlation analysis.  ...  What are not invariant under linear transformation when the three IVs are simultaneously regressed on F are the semipartial and partial correlations and regression coefficients of X and Z, and their  ...
doi:10.1037//0033-2909.85.4.858 fatcat:uldkda5pcbbtlbpxl43eh4xhma
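The invariance point in the snippet can be seen numerically: with X, Z, and the partialed product XZ all in the model, the interaction coefficient is unchanged by changes of origin in X and Z (and merely rescaled by changes of scale), while the lower-order coefficients are not. A sketch on simulated data; all names and values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500
X, Z = rng.normal(size=(2, n))
Y = 1 + 2 * X - 1.5 * Z + 0.7 * X * Z + rng.normal(0, 0.5, n)

def fit(x, z, y):
    """OLS of y on [1, x, z, xz]; returns [b0, b_x, b_z, b_xz]."""
    D = np.column_stack([np.ones_like(x), x, z, x * z])
    beta, *_ = np.linalg.lstsq(D, y, rcond=None)
    return beta

print(fit(X, Z, Y))
print(fit(X + 10, Z - 2, Y))   # change of origin: only b_xz stays the same
```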

Polynomial representations for response surface modeling [chapter]

Norman R. Draper, Friedrich Pukelsheim
1998 Lecture Notes-Monograph Series  
We show that under this partial ordering there is a constructive path of design improvement.  ...  We review some of the recent work on design optimality for response surface models and polynomial regression. However, our emphasis is not on scalar optimality criteria.  ...  This greatly facilitates our calculations when we now apply Kronecker products to response surface models.  ...
doi:10.1214/lnms/1215456198 fatcat:yizhds73yjctjpchka2o3il4ia
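A sketch of what a Kronecker-product representation of a second-degree response surface model can look like. The basis (1, x, x ⊗ x), which keeps the symmetric duplicates x_i x_j and x_j x_i as separate equal columns, is an assumption of this illustration rather than the chapter's exact notation:

```python
import numpy as np

def kronecker_quadratic_row(x):
    """One model-matrix row (1, x, x kron x) for a second-degree model."""
    return np.concatenate([[1.0], x, np.kron(x, x)])

rng = np.random.default_rng(5)
X = rng.uniform(-1, 1, size=(50, 3))
M = np.vstack([kronecker_quadratic_row(x) for x in X])
print(M.shape)   # (50, 1 + 3 + 9); the x (kron) x block keeps both x_i x_j and x_j x_i
```

The duplicated columns make M rank-deficient, so a fit should use a minimum-norm solver such as np.linalg.lstsq; the appeal of the representation, per the snippet, is that moment and design calculations reduce to Kronecker algebra.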

Regression for sets of polynomial equations [article]

Franz Johannes Király, Paul von Bünau, Jan Saputra Müller, Duncan Blythe, Frank Meinecke, Klaus-Robert Müller
2013 arXiv   pre-print
We propose a method called ideal regression for approximating an arbitrary system of polynomial equations by a system of a particular type.  ...  Ideal regression is useful whenever the solution to a learning problem can be described by a system of polynomial equations.  ...  that is, sets of polynomials closed under addition in the set, and under multiplication with arbitrary polynomials  ...
arXiv:1110.4531v4 fatcat:hz5cboayqjeybileja23sald2e
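The snippet's parenthetical defines an ideal: a set of polynomials closed under addition and under multiplication by arbitrary polynomials. A small sympy illustration of testing membership in such a set via a Gröbner basis (the generators are chosen arbitrarily; this is not the paper's ideal regression procedure):

```python
from sympy import symbols, groebner, expand

x, y = symbols("x y")
gens = [x**2 + y**2 - 1, x - y]              # generators of an ideal in Q[x, y]
G = groebner(gens, x, y, order="lex")

# Any sum of polynomial multiples of the generators stays in the ideal:
p = expand((x + 3) * gens[0] + (y**2 - x) * gens[1])
_, remainder = G.reduce(p)
print(remainder)                             # 0: p reduces to zero, so p is in the ideal
```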

The moments of products of quadratic forms in normal variables

Jan R. Magnus
1978 Statistica neerlandica (Print)  
The expectation of the product of an arbitrary number of quadratic forms in normally distributed variables is derived.  ...  Note that this formula can be used to compute all the moments of a product of an arbitrary number of quadratic forms.  ...  x_1, …, x_n from an arbitrary distribution. Then … is a quadratic form in x; the n-vector 1_n consists of ones only.  ...
doi:10.1111/j.1467-9574.1978.tb01399.x fatcat:5i2egronjnhzpfjknyvcya6i5m
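One classical special case of such moment formulas, E[(x'Ax)(x'Bx)] = tr(AΣ) tr(BΣ) + 2 tr(AΣBΣ) for x ~ N(0, Σ) and symmetric A, B, is easy to check by Monte Carlo. The sketch below assumes this special case rather than the paper's general product formula:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 4
A = rng.normal(size=(n, n)); A = (A + A.T) / 2     # symmetric matrices
B = rng.normal(size=(n, n)); B = (B + B.T) / 2
S = rng.normal(size=(n, n)); Sigma = S @ S.T       # a valid covariance

x = rng.multivariate_normal(np.zeros(n), Sigma, size=500_000)
qa = np.einsum("ij,jk,ik->i", x, A, x)             # x' A x for each sample
qb = np.einsum("ij,jk,ik->i", x, B, x)

closed = np.trace(A @ Sigma) * np.trace(B @ Sigma) + 2 * np.trace(A @ Sigma @ B @ Sigma)
print("Monte Carlo:", (qa * qb).mean(), " closed form:", closed)
```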

Model selection of polynomial kernel regression [article]

Shaobo Lin, Xingping Sun, Zongben Xu, Jinshan Zeng
2015 arXiv   pre-print
Polynomial kernel regression is one of the standard and state-of-the-art learning strategies.  ...  On one hand, based on the worst-case learning rate analysis, we show that the regularization term in polynomial kernel regression is not necessary.  ...  Under this circumstance, the computational burden of polynomial kernel regression can be reduced to much less than that of Gaussian kernel regression (see Table 4 in Section 5).  ...
arXiv:1503.02143v1 fatcat:tuawhsgplvhkpexbrxiimf62hi
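A minimal numpy sketch of polynomial kernel regression; following the entry's claim that the regularization term is not necessary, the kernel system is solved by plain (minimum-norm) least squares rather than ridge. The kernel form, degree, and data are illustrative assumptions:

```python
import numpy as np

def poly_kernel(U, V, d):
    """Inhomogeneous polynomial kernel (1 + <u, v>)^d."""
    return (1.0 + U @ V.T) ** d

rng = np.random.default_rng(7)
X = rng.uniform(-1, 1, size=(100, 2))
y = X[:, 0] * X[:, 1] + 0.5 * X[:, 0] ** 2 + rng.normal(0, 0.05, 100)

d = 3
K = poly_kernel(X, X, d)
alpha, *_ = np.linalg.lstsq(K, y, rcond=None)      # unregularized solve, cf. the entry
Xt = rng.uniform(-1, 1, size=(200, 2))
pred = poly_kernel(Xt, X, d) @ alpha
truth = Xt[:, 0] * Xt[:, 1] + 0.5 * Xt[:, 0] ** 2
print("test RMSE:", np.sqrt(np.mean((pred - truth) ** 2)))
```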

Submodular Functions Are Noise Stable [article]

Mahdi Cheraghchi, Adam Klivans, Pravesh Kothari, Homin K. Lee
2011 arXiv   pre-print
As a consequence, we obtain a polynomial-time learning algorithm for this class with respect to any product distribution on {-1,1}^n (for any constant accuracy parameter ϵ).  ...  In Section 3.2 we will prove Theorem 3 in the general setting of arbitrary product distributions.  ...  {-1,1}^n is a product distribution.  ...
arXiv:1106.0518v2 fatcat:yvphyxosvva3ff4lpauhgyw4pu

On the residuals of autoregressive processes and polynomial regression

R.J. Kulperger
1985 Stochastic Processes and their Applications  
The residual processes of a stationary AR(p) process and of polynomial regression are considered. The residuals are obtained from ordinary least squares fitting.  ...  In the polynomial case, they converge to generalized Brownian bridges. Other uses of the residuals are considered.  ...  Polynomial regression residual process: for quadratic regression the limit is B̄(t) = (B_t − tB_1) − 3t(t−1)(B_1 − 2∫_0^1 B_s ds) − 5t(t−1)(2t−1)(B_1 + 6∫_0^1 B_s ds − 12∫_0^1 s B_s ds).  ...
doi:10.1016/0304-4149(85)90380-1 fatcat:bvzbnpwutrb7df3kajp3shtoru
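A quick simulation of the residual process the entry studies, under illustrative choices (a quadratic trend fitted by OLS to pure noise): the normalized partial sums of the residuals are tied down like a generalized Brownian bridge rather than behaving as a free Brownian motion:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 1000
t = np.arange(n) / n
eps = rng.normal(size=n)                 # data: pure noise, no real trend
D = np.vander(t, 3)                      # quadratic regression design (t^2, t, 1)
beta, *_ = np.linalg.lstsq(D, eps, rcond=None)
resid = eps - D @ beta                   # OLS residuals
S = np.cumsum(resid) / np.sqrt(n)        # normalized partial-sum process
print("endpoint S(1):", S[-1])           # ~0: residuals are orthogonal to the intercept
print("max |S(t)|:", np.max(np.abs(S)))  # tied-down fluctuation
```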
Showing results 1 — 15 out of 28,828 results