
Sample-efficient strategies for learning in the presence of noise

Nicolò Cesa-Bianchi, Eli Dichterman, Paul Fischer, Eli Shamir, Hans Ulrich Simon
1999 Journal of the ACM  
In this paper, we prove various results about PAC learning in the presence of malicious noise. Our main interest is the sample size behavior of learning algorithms.  ...  We also show that this result cannot be significantly improved in general by presenting efficient learning algorithms for the class of all subsets of d elements and the class of unions of at most d intervals  ...  The authors thank an anonymous reviewer for pointing out the reference [Littlewood 1969].  ... 
doi:10.1145/324133.324221 fatcat:5futbfrcgzewjafepxkk3sqjeu

Learning Deviation Payoffs in Simulation-Based Games

Samuel Sokota, Caleb Ho, Bryce Wiedenbeck
2019 Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence (AAAI-19)  
We give a procedure for iteratively refining the learned model with new data produced by sampling in the neighborhood of each candidate Nash equilibrium.  ...  Our method uses neural networks to learn a mapping from mixed-strategy profiles to deviation payoffs—the expected values of playing pure-strategy deviations from those profiles.  ...  We have shown that deviation payoff learning outperforms the previous best and sets new performance standards for scaling in noise, number of players, number of strategies, and role asymmetries.  ... 
doi:10.1609/aaai.v33i01.33012173 fatcat:afs4ouprvjb2zmhywnzuy4mwou
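The abstract above describes learning a map from mixed-strategy profiles to deviation payoffs. A minimal sketch of that regression task is below; the synthetic payoff function, the tiny hand-rolled network, and all hyperparameters are illustrative stand-ins, not the authors' models.

```python
import numpy as np

rng = np.random.default_rng(0)
S = 3  # number of pure strategies in a small symmetric game (illustrative)

# Synthetic stand-in for the quantity being learned: the deviation payoff
# of each pure strategy against a mixed profile p on the simplex.  In the
# paper this signal comes from noisy game simulations; here it is a fixed
# smooth function chosen only so the sketch is self-contained.
W_true = rng.normal(size=(S, S))
def deviation_payoffs(p):
    return np.tanh(p @ W_true.T)

# Training data: mixed-strategy profiles drawn from the simplex.
X = rng.dirichlet(np.ones(S), size=512)
Y = deviation_payoffs(X)

def predict(X, W1, b1, W2, b2):
    return np.tanh(X @ W1 + b1) @ W2 + b2

# One-hidden-layer network fit by full-batch gradient descent.
H, lr = 32, 0.05
W1 = rng.normal(scale=0.5, size=(S, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.5, size=(H, S)); b2 = np.zeros(S)
mse0 = float(((predict(X, W1, b1, W2, b2) - Y) ** 2).mean())
for _ in range(2000):
    A = np.tanh(X @ W1 + b1)            # hidden activations
    G = 2 * (A @ W2 + b2 - Y) / len(X)  # gradient of MSE w.r.t. outputs
    GA = (G @ W2.T) * (1 - A ** 2)      # backprop through tanh
    W2 -= lr * A.T @ G;  b2 -= lr * G.sum(axis=0)
    W1 -= lr * X.T @ GA; b1 -= lr * GA.sum(axis=0)

mse = float(((predict(X, W1, b1, W2, b2) - Y) ** 2).mean())
print(f"mse before {mse0:.3f} -> after {mse:.3f}")
```

The paper's iterative refinement step would then resample profiles near candidate equilibria of the learned model and retrain; the fitting loop itself is unchanged.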

On-line learning with malicious noise and the closure algorithm [chapter]

Peter Auer, Nicolò Cesa-Bianchi
1994 Lecture Notes in Computer Science  
We investigate a variant of the on-line learning model for classes of {0, 1}-valued functions (concepts) in which the labels of a certain amount of the input instances are corrupted by adversarial noise  ...  Finally, we show how to efficiently turn any algorithm for the on-line noise model into a learning algorithm for the PAC model with malicious noise.  ...  The second author wishes to thank the Institute for Theoretical Computer Science (IGI) at the Graz University of Technology that he visited during the academic year 1993-1994.  ... 
doi:10.1007/3-540-58520-6_67 fatcat:3opzv24ju5a7ppjgahpfmsvde4

Training Strategies for Deep Learning Gravitational-Wave Searches [article]

Marlin B. Schäfer
2021 arXiv   pre-print
Extracting these signals from the instruments' background noise is a complex problem and the computational cost of most current searches depends on the complexity of the source model.  ...  With this alteration we find that the machine learning search retains ≥ 97.5% of the sensitivity of the matched-filter search down to a false-alarm rate of 1 per month.  ...  We found that the particular strategy is of little importance to the eventual performance of the network. It depends a lot more on the presence of sufficiently complex samples in the training set.  ... 
arXiv:2106.03741v1 fatcat:tzwkf5ntlzaqzciau2dwrwn4qe
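The matched-filter search that serves as the sensitivity baseline in the abstract above reduces, in its simplest form, to correlating the data with a unit-norm template and thresholding the statistic. The chirp-like template, its amplitude, and the white-noise assumption below are illustrative simplifications; real searches whiten the data against the detector's noise spectrum and maximize over a template bank.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 4096
t = np.arange(n) / n

# Toy "chirp" template, normalized to unit energy (illustrative; not a
# physical gravitational-wave waveform).
template = np.sin(2 * np.pi * (20 * t + 40 * t ** 2)) * np.hanning(n)
template /= np.linalg.norm(template)

# White-noise data stream, with and without a buried signal.
signal_amp = 6.0
data_noise = rng.normal(size=n)
data_signal = data_noise + signal_amp * template

# Matched-filter statistic: inner product of data with the unit-norm
# template.  Against white noise it is N(0, 1) without a signal and
# N(signal_amp, 1) with one.
stat_noise = float(data_noise @ template)
stat_signal = float(data_signal @ template)
print(f"statistic without signal: {stat_noise:.2f}, with signal: {stat_signal:.2f}")
```

Sensitivity comparisons like the 97.5% figure quoted above are made at a fixed false-alarm rate, i.e. at a fixed threshold on statistics of this kind.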

PAC-Learning with General Class Noise Models [chapter]

Shahin Jabbari, Robert C. Holte, Sandra Zilles
2012 Lecture Notes in Computer Science  
We introduce a framework for class noise, in which most of the known class noise models for the PAC setting can be formulated.  ...  Within this framework, we study properties of noise models that enable learning of concept classes of finite VC-dimension with the Empirical Risk Minimization (ERM) strategy.  ...  This work was supported by the Alberta Innovates Centre for Machine Learning (AICML) and the Natural Sciences and Engineering Research Council of Canada (NSERC).  ... 
doi:10.1007/978-3-642-33347-7_7 fatcat:wcl56tgzrrerdkdko6szxtqvpe

Page 6247 of Mathematical Reviews Vol. , Issue 96j [page]

1996 Mathematical Reviews  
Combined with a result of Simon, we effectively show that the sample complexity of PAC learning in the presence of classification noise is Ω(VC(F)/(ε(1 − 2η)²) + log(1/δ)/(ε(1 − 2η)²)).  ...  Specifically, we show a general lower bound of Ω(log(1/δ)/(ε(1 − 2η)²)) on the number of examples required for PAC learning in the presence of classification noise.  ... 

Improved Convolutional Neural Network Based Cooperative Spectrum Sensing For Cognitive Radio

2021 KSII Transactions on Internet and Information Systems  
Spectrum sensing is the crucial step in cognitive applications, in which the cognitive user detects the presence of the primary user (PU) in a particular channel, thereby switching to another channel for continuous  ...  The probability of detection is considered the determining parameter for evaluating the efficiency of the proposed algorithm.  ...  In this proposed technique, the probability of detection is considered one of the important parameters for the efficiency of the method.  ... 
doi:10.3837/tiis.2021.06.011 fatcat:wpxcud6ypzenzirpttns7xfage
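The probability of detection that this entry uses as its figure of merit can be illustrated with the classical energy detector, a standard baseline in spectrum sensing (the paper itself uses a CNN). The SNR, window length, and empirical-quantile threshold below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 128          # samples per sensing window
snr_db = 0.0     # illustrative SNR of the primary-user signal
trials = 4000

# Threshold set from a target false-alarm probability using noise-only
# windows (an empirical quantile rather than the chi-square formula, to
# keep the sketch self-contained).
target_pfa = 0.1
noise_energy = (rng.normal(size=(trials, N)) ** 2).sum(axis=1)
thresh = np.quantile(noise_energy, 1 - target_pfa)

# Monte-Carlo estimate of the probability of detection when the PU
# signal is present in the window.
amp = 10 ** (snr_db / 20)
sig_energy = ((amp * rng.normal(size=(trials, N))
               + rng.normal(size=(trials, N))) ** 2).sum(axis=1)
pd = float((sig_energy > thresh).mean())
print(f"estimated Pd at Pfa={target_pfa}: {pd:.2f}")
```

Cooperative schemes like the one in the paper fuse such per-user decisions (or learned features) across users to raise Pd at the same false-alarm rate.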

PAC-learning in the presence of one-sided classification noise

Hans Ulrich Simon
2012 Annals of Mathematics and Artificial Intelligence  
We derive an upper and a lower bound on the sample size needed for PAC-learning a concept class in the presence of one-sided classification noise.  ...  The upper bound is achieved by the strategy "Minimum One-sided Disagreement". It matches the lower bound (which holds for any learning strategy) up to a logarithmic factor.  ...  This strategy is reasonable when learning takes place in the presence of one-sided classification noise.  ... 
doi:10.1007/s10472-012-9325-7 fatcat:klkhocit4rayhmg6xl5y4efq4a
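For threshold concepts, the "Minimum One-sided Disagreement" strategy named in the abstract can be made concrete under the assumption that noise only flips positive labels to negative: an observed 1-label is then always trustworthy, so the hypothesis must cover every observed positive, and among such hypotheses the one disagreeing least with the (unreliable) 0-labels is chosen. The concept class and all parameters below are an illustrative instantiation, not the paper's general setting.

```python
import numpy as np

rng = np.random.default_rng(2)
n, eta, t_star = 2000, 0.3, 0.5   # sample size, noise rate, true threshold

x = rng.uniform(0.0, 1.0, size=n)
y = (x >= t_star).astype(int)               # clean labels for h(x) = 1[x >= t*]
flip = (rng.uniform(size=n) < eta) & (y == 1)
y[flip] = 0                                 # one-sided noise: only 1 -> 0 flips

# Minimum One-sided Disagreement for thresholds under this noise model:
# cover all observed positives (their labels are certain), and take the
# largest such threshold to minimize disagreement on the 0-labels.
t_hat = x[y == 1].min()
print(f"recovered threshold: {t_hat:.3f} (true {t_star})")
```

Note the recovered threshold can only overshoot t*, never undershoot it, since flips only remove positives; the paper's bounds quantify how much data is needed to make that overshoot small.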

An Online Stochastic Kernel Machine for Robust Signal Classification [article]

Raghu G. Raj
2019 arXiv   pre-print
We present a novel variation of online kernel machines in which we exploit a consensus-based optimization mechanism to guide the evolution of decision functions drawn from a reproducing kernel Hilbert space, which efficiently models the observed stationary process.  ...  Whereas in the former setting, learning is based on a random batch of training samples from which a single hypothesis is formulated for prediction, in the online setting the learning algorithm observes  ... 
arXiv:1905.07686v2 fatcat:ctg7radssndhjhswpjnn2yutye

Ideal observer analysis of crowding and the reduction of crowding through learning

G. J. Sun, S. T. L. Chung, B. S. Tjan
2010 Journal of Vision  
The improvement in equivalent input noise and sampling efficiency persists for at least 6 months.  ...  Across subjects, the improvement was attributable to either a decrease in crowding-induced equivalent input noise or an increase in sampling efficiency, but seldom both.  ...  Acknowledgments We thank Paul Bulakowski for his comments on an earlier draft of this paper, and Denis Pelli, an anonymous reviewer, and James Elder (the editor) for their extensive and detailed comments  ... 
doi:10.1167/10.5.16 pmid:20616136 pmcid:PMC3096759 fatcat:idaz6aahiva25bog7wfg4by77q

Specification and Simulation of Statistical Query Algorithms for Efficiency and Noise Tolerance

Javed A Aslam, Scott E Decatur
1998 Journal of computer and system sciences (Print)  
We show that the learning algorithms obtained by simulating efficient relative error SQ algorithms both in the absence of noise and in the presence of malicious noise have roughly optimal sample complexity  ...  We also show that the simulation of efficient relative error SQ algorithms in the presence of classification noise yields learning algorithms at least as efficient as those obtained through standard methods  ...  We have shown a non-trivial learning problem for which a simulation in the presence of classification noise yields a roughly optimal sample complexity.  ... 
doi:10.1006/jcss.1997.1558 fatcat:fzb5m5crrvbepo6qlcmw6jjrim
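The classification-noise simulation this abstract refers to rests on a standard identity for correlation statistical queries: noise at rate η shrinks E[g(x)·ℓ] by a factor (1 − 2η), so the clean query value can be recovered from noisy examples by dividing out that factor. The target concept and query function g below are illustrative choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(3)
n, eta = 20000, 0.2

x = rng.normal(size=n)
label = np.sign(x)                       # target concept f(x) = sign(x)
noisy = np.where(rng.uniform(size=n) < eta, -label, label)

# Correlation query chi(x, l) = g(x) * l with g(x) = tanh(x).  Under
# classification noise of rate eta, E[g(x) * noisy_label] equals
# (1 - 2*eta) * E[g(x) * label], so dividing the noisy empirical mean by
# (1 - 2*eta) recovers the clean query value (up to sampling error).
g = np.tanh(x)
clean_value = float((g * label).mean())
corrected = float((g * noisy).mean() / (1 - 2 * eta))
print(f"clean {clean_value:.3f}  corrected-from-noisy {corrected:.3f}")
```

The sample-complexity results in the paper quantify exactly how the variance inflation from the 1/(1 − 2η) correction affects the number of examples each simulated query needs.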

Cognitive Radio Technology [chapter]

2006 Next Generation Wireless Systems and Networks  
Within the field of Cognitive Radio, many different aspects of the system must be considered in order to achieve optimal performance.  ...  Cognitive Radio is a novel method of radio communication which enables more efficient use of the frequency spectrum.  ...  New strategies of cooperation in cognitive radio will be conceived so that this technology can evolve and become viably efficient for the consumer market.  ... 
doi:10.1002/0470024569.ch9 fatcat:33fa7xjqc5g7fhts7xlknpdpqy

Robust Medical Image Classification from Noisy Labeled Data with Global and Local Representation Guided Co-training [article]

Cheng Xue, Lequan Yu, Pengfei Chen, Qi Dou, Pheng-Ann Heng
2022 arXiv   pre-print
In this paper, we propose a novel collaborative training paradigm with global and local representation learning for robust medical image classification from noisy-labeled data, to combat the lack of high  ...  We evaluated our proposed robust learning strategy on four public medical image classification datasets with three types of label noise, i.e., random noise, computer-generated label noise, and inter-observer  ...  Moreover, label noise in medical images is usually instance-dependent, which makes estimating the transition matrix even more challenging. (2) Sample selection strategy.  ... 
arXiv:2205.04723v1 fatcat:zmumvcntnzdtfbauvt7nba2cyy
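A widely used ingredient of sample selection strategies in collaborative training against label noise is small-loss selection: each peer network keeps the lowest-loss fraction of samples for the other, on the premise (the memorization effect) that clean-label samples tend to incur smaller loss early in training. The sketch below is generic, with synthetic loss distributions, and is not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(4)
n, noise_rate = 1000, 0.3

# Synthetic per-sample losses: clean-label samples tend to have small
# loss, noisy-label samples larger loss.  The gamma parameters are purely
# illustrative stand-ins for real network losses.
is_noisy = rng.uniform(size=n) < noise_rate
loss = np.where(is_noisy,
                rng.gamma(shape=6.0, scale=0.5, size=n),   # noisy: larger
                rng.gamma(shape=1.5, scale=0.3, size=n))   # clean: smaller

# Keep the (1 - noise_rate) fraction with the smallest loss, as one peer
# network would do for the other in a co-training round.
keep = np.argsort(loss)[: int((1 - noise_rate) * n)]
purity = float(1 - is_noisy[keep].mean())
print(f"fraction of kept samples with clean labels: {purity:.2f}")
```

In practice the noise rate is unknown and must itself be estimated, which is part of what makes instance-dependent noise (as the abstract notes) harder.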

Improved Niching and Encoding Strategies for Clustering Noisy Data Sets [chapter]

Olfa Nasraoui, Elizabeth Leon
2004 Lecture Notes in Computer Science  
We formulate requirements for efficient encoding, resistance to noise, and ability to discover the number of clusters automatically.  ...  Clustering is crucial to many applications in pattern recognition, data mining, and machine learning.  ...  Because UNC uses robust weights in its cluster fitness definition, it is less sensitive to the presence of noise.  ... 
doi:10.1007/978-3-540-24855-2_148 fatcat:r73f6l4jyvfjlal6f3e2we4no4
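The robust weights this snippet credits for noise resistance can be illustrated with an exponentially decaying weight on squared distance: points far from the current cluster center receive negligible weight, so gross noise barely moves the estimate. The Gaussian weight function, σ, and iteration count below are illustrative choices, not UNC's actual fitness definition.

```python
import numpy as np

rng = np.random.default_rng(5)

# One cluster at the origin plus gross outliers (noise points).
inliers = rng.normal(loc=0.0, scale=0.5, size=(200, 2))
outliers = rng.uniform(5.0, 8.0, size=(20, 2))
pts = np.vstack([inliers, outliers])

# Robust-weighted center estimate: points far from the current center get
# exponentially small weight, so the update is dominated by inliers.
center = pts.mean(axis=0)
sigma = 1.0
for _ in range(10):
    d2 = ((pts - center) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    center = (w[:, None] * pts).sum(axis=0) / w.sum()

plain_mean = pts.mean(axis=0)
print(f"plain mean {plain_mean.round(2)}  robust center {center.round(2)}")
```

The unweighted mean is visibly dragged toward the outliers, while the robust center stays near the true cluster, which is the behavior the cluster-fitness weighting in the paper is designed to preserve under noise.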

FedNoiL: A Simple Two-Level Sampling Method for Federated Learning with Noisy Labels [article]

Zhuowei Wang, Tianyi Zhou, Guodong Long, Bo Han, Jing Jiang
2022 arXiv   pre-print
Hence, the labels in practice are usually annotated by clients of varying expertise or criteria and thus contain different amounts of noise.  ...  In experiments with homogeneous/heterogeneous data distributions and noise ratios, we observed that direct combinations of SOTA FL methods with SOTA noisy-label learning methods can easily fail, but our  ...  in improving the robustness of the global model in the presence of noisy labels.  ... 
arXiv:2205.10110v1 fatcat:fcjeh3p6jnfjbinkpesqpqbgnu