Hypothesis Set Stability and Generalization [article]

Dylan J. Foster and Spencer Greenberg and Satyen Kale and Haipeng Luo and Mehryar Mohri and Karthik Sridharan
2020 arXiv   pre-print
Our main result is a generalization bound for data-dependent hypothesis sets expressed in terms of a notion of hypothesis set stability and a notion of Rademacher complexity for data-dependent hypothesis  ...  We present a study of generalization for data-dependent hypothesis sets.  ...  The work of SG and MM was partly supported by NSF CCF-1535987, NSF IIS-1618662, and a Google Research Award. KS would like to acknowledge NSF CAREER Award 1750575 and Sloan Research Fellowship.  ... 
arXiv:1904.04755v3 fatcat:q5g4q7mwnjdpllkhtbdu2lvgga

Stability of decision trees and logistic regression [article]

Nino Arsov, Martin Pavlovski, Ljupco Kocarev
2019 arXiv   pre-print
To that end, in this paper, we derive two stability notions for decision trees and logistic regression: hypothesis and pointwise hypothesis stability.  ...  to optimize the stability of logistic regression, and hence decrease its generalization error.  ...  Then the hypothesis stability upper bound on the generalization error of decision trees for any δ ∈ (0, 1) and any m ≥ 1 is given in Corollary 4.3 (hypothesis and pointwise hypothesis stability generalization  ... 
arXiv:1903.00816v1 fatcat:c5pauiymujdxrifh7irhz5lc5m
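The hypothesis-stability notion in the entry above measures how much a learned hypothesis changes, on average, when one training example is deleted. A minimal sketch of an empirical estimate, using a toy mean-predictor in place of decision trees or logistic regression (the learner and all names here are illustrative stand-ins, not the authors' method):

```python
import random

def learn(train):
    """Toy learner: predicts the mean label of the training set."""
    ys = [y for _, y in train]
    mean = sum(ys) / len(ys)
    return lambda x: mean

def hypothesis_stability(data, loss):
    """Average change in loss when one training example is
    deleted (leave-one-out perturbation of the training set)."""
    m = len(data)
    h_full = learn(data)
    diffs = []
    for i in range(m):
        h_i = learn(data[:i] + data[i + 1:])  # retrain without example i
        for x, y in data:
            diffs.append(abs(loss(h_full(x), y) - loss(h_i(x), y)))
    return sum(diffs) / len(diffs)

random.seed(0)
data = [(x, 2.0 * x + random.gauss(0, 0.1)) for x in range(10)]
sq = lambda p, y: (p - y) ** 2
beta = hypothesis_stability(data, sq)  # small beta = more stable learner
```

Smaller `beta` means the learner is less sensitive to single-example deletions, which is exactly what the stability-based generalization bounds reward.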

Almost-everywhere algorithmic stability and generalization error [article]

Samuel Kutin, Partha Niyogi
2012 arXiv   pre-print
We introduce the new notion of training stability of a learning algorithm and show that, in a general setting, it is sufficient for good bounds on generalization error.  ...  In the PAC setting, training stability is both necessary and sufficient for learnability. The approach based on training stability makes no reference to VC dimension or VC entropy.  ...  We need this result to extend Bousquet and Elisseeff's argument [1] that uniform hypothesis stability gives good bounds on generalization error to weaker notions of stability.  ... 
arXiv:1301.0579v1 fatcat:fkamovnxeze2rl4b6hjpbpvfa4

Stacking and stability [article]

Nino Arsov, Martin Pavlovski, Ljupco Kocarev
2019 arXiv   pre-print
Moreover, in bag-stacking and dag-stacking, the hypothesis stability depends on the sampling strategy used to generate the training set replicates.  ...  We show that the hypothesis stability of stacking is a product of the hypothesis stability of each of the base models and the combiner.  ...  This aspect allowed us to formally study the performance of stacking by analyzing its hypothesis stability and establishing a connection to bag-stacking and dag-stacking.  ... 
arXiv:1901.09134v1 fatcat:i2wa2o2azzgblhegcuvmrxnt4a

General conditions for predictivity in learning theory

Tomaso Poggio, Ryan Rifkin, Sayan Mukherjee, Partha Niyogi
2004 Nature  
For the case of more general algorithms, we note that any learning algorithm is a map L from data sets to hypothesis functions.  ...  Here we provide conditions for generalization in terms of a precise stability property of the learning process: when the training set is perturbed by deleting one example, the learned hypothesis does not  ...  Correspondence and requests for materials should be addressed to T.P.  ... 
doi:10.1038/nature02341 pmid:15042089 fatcat:2fcm5kscmrckdfqkxl4rofqxuu

Learning theory: stability is sufficient for generalization and necessary and sufficient for consistency of empirical risk minimization

Sayan Mukherjee, Partha Niyogi, Tomaso Poggio, Ryan Rifkin
2006 Advances in Computational Mathematics  
Thus LOO stability is a weak form of stability that represents a sufficient condition for generalization for symmetric learning algorithms while subsuming the classical conditions for consistency of ERM  ...  it and, (b) necessary and sufficient for consistency of ERM.  ...  We are also grateful to Massi (and Theos).  ... 
doi:10.1007/s10444-004-7634-z fatcat:64426vw54fd5vn2thn6apemeui

Pharmacophore Models of Paclitaxel- and Epothilone-Based Microtubule Stabilizing Agents

Sangbae Lee, Yuno Lee, James M. Briggs, Keun Woo Lee
2013 Bulletin of the Korean Chemical Society (Print)  
On the basis of biological activities of the training sets, five- and four-feature pharmacophore hypotheses were generated in the epothilone and paclitaxel series.  ...  The validation of generated hypotheses was achieved by using twelve epothilones and ten paclitaxels, respectively, which are not in the training sets.  ...  These results will be used to construct training sets and test sets and to generate their pharmacophore models. Pharmacophore Generation for the Epothilone and Paclitaxel Derivatives.  ... 
doi:10.5012/bkcs.2013.34.7.1972 fatcat:slmjpzfyrrfnxazmyfimbl4mly

The Survival of the Smallest: Stability Conditions for the Cultural Evolution of Compositional Language [chapter]

Henry Brighton, Simon Kirby
2001 Lecture Notes in Computer Science  
Our experiments show that compositional syntax is most likely to occur under two conditions specific to hominids: (i) A complex meaning space structure, and (ii) the poverty of the stimulus.  ...  Before detailing our analysis, we set the scene by discussing the iterated learning model and the role of stability. Iterated Learning.  ...  One generation of the iterated learning model involves a single agent observing a set of meaning/signal pairs produced by the previous generation.  ... 
doi:10.1007/3-540-44811-x_67 fatcat:k62t7dmiwbaqvkabykrio5fus4
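One generation of the iterated learning model described in the entry above — an agent observes meaning/signal pairs produced by the previous generation, learns from them, and produces pairs for the next — can be sketched directly. The lexicon learner, the structured meaning space, and the `invent` function below are deliberately simple stand-ins, not the authors' model:

```python
import random

def learn(observations):
    """Build a meaning -> signal lexicon from observed pairs."""
    lexicon = {}
    for meaning, signal in observations:
        lexicon[meaning] = signal
    return lexicon

def produce(lexicon, meanings, invent):
    """Express every meaning; invent a random signal for unseen ones."""
    return [(m, lexicon.get(m, invent())) for m in meanings]

random.seed(1)
meanings = [(a, b) for a in range(3) for b in range(3)]  # structured meaning space
invent = lambda: "".join(random.choice("ab") for _ in range(3))

# Run a few generations; each learner sees only a subset of the
# previous generation's output (poverty of the stimulus).
pairs = produce({}, meanings, invent)
for generation in range(5):
    observed = random.sample(pairs, 6)   # transmission bottleneck: 6 of 9 meanings
    lexicon = learn(observed)
    pairs = produce(lexicon, meanings, invent)
```

The bottleneck is the key ingredient: only mappings that can be reconstructed from partial data survive repeated transmission, which is the stability pressure the paper analyzes.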

Learning theory: Past performance and future results

Carlo Tomasi
2004 Nature  
The simplicity and generality of the stability criterion promises practical utility.  ...  In other words, can useful measures of stability and generalization be estimated from finite training samples? And is it feasible to develop statistical confidence tests for them?  ... 
doi:10.1038/428378a pmid:15042073 fatcat:vjokuwfq3fh7jnsp4d3mg4xary

Stability Analysis and Learning Bounds for Transductive Regression Algorithms [article]

Corinna Cortes, Mehryar Mohri, Dmitry Pechyony, Ashish Rastogi
2009 arXiv   pre-print
This paper uses the notion of algorithmic stability to derive novel generalization bounds for several families of transductive regression algorithms, both by using convexity and closed-form solutions.  ...  Our analysis helps compare the stability of these algorithms. It also shows that a number of widely used transductive regression algorithms are in fact unstable.  ...  General transduction stability bounds Stability-based generalization bounds in the inductive setting are derived using McDiarmid's inequality (McDiarmid, 1989) .  ... 
arXiv:0904.0814v1 fatcat:4b5iv36l6bfg5leja3iakmrezi
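The entry above notes that stability-based bounds in the inductive setting rest on McDiarmid's inequality (bounded differences): if changing the i-th input alone changes f by at most c_i, then P(|f − E f| ≥ t) ≤ 2 exp(−2t² / Σᵢ c_i²). The bound itself is easy to compute; a sketch:

```python
import math

def mcdiarmid_bound(cs, t):
    """Two-sided McDiarmid tail bound P(|f - E[f]| >= t)
    <= 2 * exp(-2 t^2 / sum(c_i^2)), where cs are the
    bounded-differences constants of f (capped at 1)."""
    return min(1.0, 2.0 * math.exp(-2.0 * t * t / sum(c * c for c in cs)))

# Example: the empirical mean of m i.i.d. variables in [0, 1]
# has c_i = 1/m, so the bound reduces to 2 * exp(-2 m t^2).
m, t = 1000, 0.05
bound = mcdiarmid_bound([1.0 / m] * m, t)  # = 2 * exp(-5)
```

In stability arguments, f is the generalization gap of the learned hypothesis, and the stability coefficient controls the constants c_i.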

Experimental learning of quantum states

Andrea Rocchetto, Scott Aaronson, Simone Severini, Gonzalo Carvacho, Davide Poderini, Iris Agresti, Marco Bentivegna, Fabio Sciarrino
2019 Science Advances  
This scaling limits our ability to characterize and simulate the evolution of arbitrary states to systems with no more than a few qubits.  ...  provide the first experimental demonstration that quantum states can be "probably approximately learned" with access to a number of copies of the state that scales linearly with the number of qubits, and  ...  The learning algorithm takes as input the training set and generates a hypothesis h ∈ H that approximates f. The PAC model makes use of two approximation parameters, ε and δ.  ... 
doi:10.1126/sciadv.aau1946 fatcat:jdmxn6zfqbbfrexvhrj2fodhmu
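For a finite hypothesis class, the PAC parameters ε and δ mentioned above enter through the textbook sample-complexity bound m ≥ (1/ε)(ln |H| + ln (1/δ)) for consistent learners. The paper's linear-in-qubits scaling is a far stronger statement about quantum states, so the sketch below is only the classical baseline:

```python
import math

def pac_sample_size(hypothesis_count, epsilon, delta):
    """Smallest m with m >= (1/epsilon) * (ln|H| + ln(1/delta)):
    enough i.i.d. examples so that, with probability >= 1 - delta,
    every hypothesis consistent with the sample has error <= epsilon."""
    return math.ceil(
        (math.log(hypothesis_count) + math.log(1.0 / delta)) / epsilon
    )

m = pac_sample_size(hypothesis_count=2 ** 20, epsilon=0.1, delta=0.05)
```

Note the logarithmic dependence on |H|: doubling the class size adds only ln 2 / ε extra examples.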

Approximating Semigroups and the Consistency of Difference Schemes

Gilbert Strang
1969 Proceedings of the American Mathematical Society  
Therefore, if for f ∈ Ω the quantity (e^{kL} − I)L^{−1}e^{kL}f approaches zero, convergence holds on the dense set Ω. Then stability produces convergence for all f.  ...  In the discrete case, Lax replaced convergence on a dense subset by the following hypothesis of consistency: For each T > 0, there is a set Ω dense in D(L), and therefore dense in 𝔅, such that for f ∈ Ω  ... 
doi:10.2307/2035948 fatcat:ztp3tfypazcuje755qt5huigje
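The consistency hypothesis alluded to in the entry above is the one from the Lax equivalence framework. A restatement in standard notation (a sketch only, since the extracted text garbles the symbols; C(k) denotes the one-step difference operator with step k and e^{tL} the semigroup generated by L):

```latex
% Consistency (Lax): for every T > 0 there is a set \Omega dense in D(L),
% and hence dense in the Banach space \mathcal{B}, such that for f \in \Omega
\sup_{0 \le t \le T}
  \frac{1}{k}\,\bigl\| \bigl(C(k) - e^{kL}\bigr)\, e^{tL} f \bigr\|
  \;\longrightarrow\; 0 \qquad (k \to 0).
% Consistency together with stability then yields convergence for all f \in \mathcal{B}.
```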

Control system synthesis through inductive learning of Boolean concepts

1995 IEEE Control Systems  
In certain cases, however, it is desirable to identify the set of all controllers which ensure that the controlled plant satisfies a control property such as Lyapunov stability, robust stability, or robust  ...  The first example examined in this article uses concept learning to identify the set of stabilizing controllers for certain classes of linear time-invariant plants.  ...  Denote the ith hypothesis generated by the algorithm as h_i(k) and let E(H_i, k_i) be its associated ellipsoidal set.  ... 
doi:10.1109/37.387614 fatcat:ouggalpd6nf57jxwjblrhc3ihy

Syllabification in Moroccan Arabic: evidence from patterns of temporal stability in articulation*

Jason Shaw, Adamantios I. Gafos, Philip Hoole, Chakir Zeroual
2009 Phonology  
Overall, the paper provides support for the broad hypothesis that syllable structure is reflected in patterns of temporal stability and contributes analytical tools to evaluate competing theories on the  ...  Beyond this specific result for Moroccan Arabic, the model reveals the range of validity of certain stability-based indexes of syllable structure and generates predictions that allow evaluation of a syllabic  ...  Simulations The model described above was used to generate sets of 30 tokens. Each set contained ten instances of each #CV, #CCV and #CCCV word.  ... 
doi:10.1017/s0952675709001754 fatcat:pkiokbfkpvetfj7as45lpd2ukm