
Learning with Instance-Dependent Label Noise: A Sample Sieve Approach [article]

Hao Cheng, Zhaowei Zhu, Xingyu Li, Yifei Gong, Xing Sun, Yang Liu
2021 arXiv   pre-print
Therefore, providing theoretically rigorous solutions for learning with instance-dependent label noise remains a challenge.  ...  We demonstrate the performance of CORES^2 on CIFAR10 and CIFAR100 datasets with synthetic instance-dependent label noise and Clothing1M with real-world human noise.  ...  A second-order approach to learning with instance-dependent label noise. In The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), June 2021a.  ...
arXiv:2010.02347v2 fatcat:zd7uegcxevhblotjhptqwc6dde
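
To make the "sample sieve" idea in the entry above concrete, here is a minimal loss-based sketch: it keeps training examples whose per-example loss falls below a threshold and treats the rest as likely corrupted. This is only an illustrative stand-in (CORES^2 itself sieves with a confidence-regularized criterion and dynamic thresholds); `model`, `loader`, and `threshold` are placeholders.

```python
import torch
import torch.nn.functional as F

def sieve_by_loss(model, loader, threshold):
    """Illustrative sample sieve: keep examples whose per-example cross-entropy
    is below `threshold`; treat the rest as likely label-corrupted.
    (Stand-in only; CORES^2 uses a confidence-regularized criterion.)"""
    kept, sieved = [], []
    model.eval()
    with torch.no_grad():
        for batch_idx, (x, y) in enumerate(loader):
            losses = F.cross_entropy(model(x), y, reduction="none")
            for i, loss in enumerate(losses):
                (kept if loss.item() < threshold else sieved).append((batch_idx, i))
    return kept, sieved
```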

A Second-Order Approach to Learning with Instance-Dependent Label Noise [article]

Zhaowei Zhu, Tongliang Liu, Yang Liu
2021 arXiv   pre-print
Experiments on CIFAR10 and CIFAR100 with synthetic instance-dependent label noise and Clothing1M with real-world human label noise verify our approach.  ...  to a new problem with only class-dependent label noise.  ...  Learning with instance-dependent label noise: A sample sieve approach.  ...
arXiv:2012.11854v2 fatcat:en3qcug6s5c67kg3bwvsjtmu5y

Instance-Dependent Label-Noise Learning with Manifold-Regularized Transition Matrix Estimation [article]

De Cheng, Tongliang Liu, Yixiong Ning, Nannan Wang, Bo Han, Gang Niu, Xinbo Gao, Masashi Sugiyama
2022 arXiv   pre-print
Experimental evaluations on four synthetic and two real-world datasets demonstrate that our method is superior to state-of-the-art approaches for label-noise learning under the challenging IDN.  ...  However, it is very challenging to estimate the transition matrix T(x), where x denotes the instance, because it is unidentifiable under the instance-dependent noise (IDN).  ...  The proposed instance-dependent label-noise learning framework.  ...
arXiv:2206.02791v1 fatcat:wf7sv2noubb3dlvq2izyd7av5y
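
For readers unfamiliar with how an instance-dependent transition matrix T(x) is used once it has been estimated, here is a generic forward-correction sketch. The estimation step (the manifold-regularized part that is this paper's contribution) is not shown, and the tensor shapes are assumptions.

```python
import torch

def forward_corrected_nll(clean_posterior, T_x, noisy_labels):
    """Forward loss correction with an instance-dependent transition matrix:
    p(noisy label | x) = T(x)^T p(clean label | x).
    clean_posterior: (N, C) softmax outputs of the classifier,
    T_x:             (N, C, C) with T_x[n, i, j] = p(noisy=j | clean=i, x_n),
    noisy_labels:    (N,) observed labels.
    Estimating T(x) itself (e.g. with manifold regularization) is the hard,
    paper-specific part and is not sketched here."""
    noisy_posterior = torch.einsum("nc,nck->nk", clean_posterior, T_x)
    picked = noisy_posterior.gather(1, noisy_labels.unsqueeze(1)).clamp_min(1e-12)
    return -picked.log().mean()
```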

Centrality and Consistency: Two-Stage Clean Samples Identification for Learning with Instance-Dependent Noisy Labels [article]

Ganlong Zhao, Guanbin Li, Yipeng Qin, Feng Liu, Yizhou Yu
2022 arXiv   pre-print
Second, for the remaining clean samples that are close to the ground truth class boundary (usually mixed with the samples with instance-dependent noises), we propose a novel consistency-based classification  ...  While in practice, the real-world noise patterns are usually more fine-grained as instance-dependent ones, which poses a big challenge, especially in the presence of inter-class imbalance.  ...  After identifying all clean samples, we follow DivideMix [17] and implement the learning with instance-dependent noisy labels as a semi-supervised learning problem that takes the clean samples as labeled  ... 
arXiv:2207.14476v1 fatcat:3vcrihvlofeuflurtz6s5xpbq4
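
The snippet above mentions recasting the problem as semi-supervised learning once clean samples are identified, in the style of DivideMix. The minimal sketch below only performs that split (clean samples keep their labels, the rest become unlabeled); the clean/noisy identification and the downstream SSL method are assumed to be supplied elsewhere.

```python
import numpy as np

def split_for_ssl(features, noisy_labels, clean_mask):
    """Treat identified clean samples as labeled data and drop the labels of
    the remaining (likely noisy) samples, so any semi-supervised method can
    consume the two sets. The two-stage clean-sample identification is the
    paper's contribution and is assumed given via `clean_mask`."""
    clean_mask = np.asarray(clean_mask, dtype=bool)
    labeled = (features[clean_mask], noisy_labels[clean_mask])
    unlabeled = features[~clean_mask]   # labels discarded, treated as unknown
    return labeled, unlabeled
```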

Learning with queries corrupted by classification noise

Jeffrey Jackson, Eli Shamir, Clara Shwartzman
1999 Discrete Applied Mathematics  
We apply the general analysis to get a noise-robust version of Jackson's Harmonic Sieve, which learns DNF under the uniform distribution.  ...  Kearns introduced the "statistical query" (SQ) model as a general method for producing learning algorithms which are robust against classification noise.  ...  The variance issue of the noise comes up when (3.5) is approximated by a sample with noise-corrupted labels.  ... 
doi:10.1016/s0166-218x(99)00045-1 fatcat:x7tc74s6z5autl3ebi5t3lgdfi
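
The "variance issue" mentioned in the snippet comes from the standard way statistical-query estimates are de-biased under classification noise: for labels in {-1, +1} flipped independently with known rate eta < 1/2, the noisy correlation equals (1 - 2*eta) times the clean one, so dividing restores the expectation while inflating the variance. A small sketch of that correction (not the Harmonic Sieve itself):

```python
import numpy as np

def noise_corrected_correlation(chi_values, noisy_labels, eta):
    """Unbiased estimate of E[chi(x) * f(x)] from labels flipped i.i.d. with
    probability eta < 1/2: E[chi * noisy_label] = (1 - 2*eta) * E[chi * f(x)],
    so dividing by (1 - 2*eta) removes the bias at the cost of a variance
    inflated by roughly 1 / (1 - 2*eta)^2.
    chi_values, noisy_labels: arrays of +/-1 values; eta is assumed known."""
    return np.mean(chi_values * noisy_labels) / (1.0 - 2.0 * eta)
```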

CANDLE: Classification And Noise Detection With Local Embedding Approximations

Erik Thordsen, Erich Schubert
2021 Lernen, Wissen, Daten, Analysen  
In this paper, we propose a combination of both tasks that is based on a score of how close a sample is to the manifold spun by the training data.  ...  The machine learning tasks of supervised classification and unsupervised noise detection are commonly performed separately.  ...  for all, non-noise/undecided-labeled, noise-labeled, and undecided-labeled samples of the MNIST-PCA test set.  ... 
dblp:conf/lwa/ThordsenS21 fatcat:rmgxzpmmrbdg3oo4pduiya7s2u
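
The score of "how close a sample is to the manifold spun by the training data" can be approximated in several ways; the sketch below uses a local-PCA reconstruction residual over the k nearest training neighbours as a generic stand-in, not the exact CANDLE formulation, and `k` and `local_dim` are arbitrary illustrative choices.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import NearestNeighbors

def manifold_proximity_score(train_X, test_X, k=20, local_dim=5):
    """Generic manifold-proximity score: reconstruct each test point from a
    local PCA fitted on its k nearest training neighbours and return the
    residual norm (large residual = far from the training manifold = likely
    noise). A stand-in for CANDLE's local embedding approximations."""
    nn = NearestNeighbors(n_neighbors=k).fit(train_X)
    _, idx = nn.kneighbors(test_X)
    scores = np.empty(len(test_X))
    for i, neigh in enumerate(idx):
        pca = PCA(n_components=local_dim).fit(train_X[neigh])
        recon = pca.inverse_transform(pca.transform(test_X[i:i + 1]))
        scores[i] = np.linalg.norm(test_X[i] - recon[0])
    return scores
```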

Uncertain Classification of Variable Stars: Handling Observational GAPS and Noise

Nicolás Castro, Pavlos Protopapas, Karim Pichara
2017 Astronomical Journal  
Finally a bagging approach is used to improve the overall performance of the classification.  ...  Our method uses Gaussian Process Regression to form a probabilistic model of each lightcurve's observations. Then, based on this model, bootstrapped samples of the time series features are generated.  ...  A white noise kernel with standard deviation equal to a randomly chosen value between zero and a fixed percentage of the amplitude is then added to each feature of each instance.  ... 
doi:10.3847/1538-3881/aa9ab8 fatcat:gevcylte6za7bhi7ebrlkn6ogu
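
As a rough illustration of the pipeline described in the snippet (a GP model of each light curve, bootstrapped realisations, and a white-noise kernel with a randomly scaled standard deviation), here is a scikit-learn sketch; the RBF kernel, its length scale, and the `noise_frac` cap are assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def bootstrap_lightcurve(t, mag, n_boot=50, noise_frac=0.1, seed=0):
    """Fit a GP to one light curve (times t, magnitudes mag) and draw n_boot
    posterior realisations; a white-noise kernel whose standard deviation is a
    random fraction (up to noise_frac) of the amplitude is added, mirroring
    the randomly scaled white noise described above."""
    rng = np.random.default_rng(seed)
    noise_std = rng.uniform(0.0, noise_frac) * (mag.max() - mag.min())
    kernel = RBF(length_scale=10.0) + WhiteKernel(noise_level=noise_std ** 2)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(t.reshape(-1, 1), mag)
    return gp.sample_y(t.reshape(-1, 1), n_samples=n_boot, random_state=seed)
```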

Detecting Corrupted Labels Without Training a Model to Predict [article]

Zhaowei Zhu, Zihao Dong, Yang Liu
2022 arXiv   pre-print
The second one is a ranking-based approach that scores each instance and filters out a guaranteed number of instances that are likely to be corrupted.  ...  Experiments with both synthetic and real-world label noise demonstrate our training-free solutions consistently and significantly improve most of the training-based baselines.  ...  for general instance-dependent label noise with heterogeneous noise rates (Cheng et al., 2021a).  ...
arXiv:2110.06283v3 fatcat:776xrs47n5bydjzbqi6dugl4ke
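
A minimal, training-free version of the ranking idea in the snippet (score every instance, then filter a fixed number of the most suspicious ones) can be written with a k-NN label-agreement score. The simplified scoring rule below is an assumption for illustration, not the paper's estimator.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def rank_and_filter(features, noisy_labels, num_to_remove, k=10):
    """Score each instance by the fraction of its k nearest neighbours (in
    feature space) sharing its label, then flag the num_to_remove instances
    with the lowest agreement as likely corrupted."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(features)
    _, idx = nn.kneighbors(features)            # column 0 is the point itself
    agree = (noisy_labels[idx[:, 1:]] == noisy_labels[:, None]).mean(axis=1)
    flagged = np.argsort(agree)[:num_to_remove]
    return flagged, agree
```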

Artificial night light and anthropogenic noise interact to influence bird abundance over a continental scale

Ashley A. Wilson, Mark A. Ditmer, Jesse R. Barber, Neil H. Carter, Eliot T. Miller, Luke P. Tyrrell, Clinton D. Francis
2021 Global Change Biology  
Here, we had three aims: (1) to investigate species-specific responses to light, noise, and the interaction between the two using a spatially explicit approach to model changes in abundance of 140 prevalent  ...  We found species that responded to noise exposure generally decreased in abundance, and the additional presence of light interacted synergistically with noise to exacerbate its negative effects.  ...  Parameter estimates from each approach were nearly identical for species with smaller sample sizes, but tended to diverge more for those with larger sample sizes (Table S2).  ...
doi:10.1111/gcb.15663 pmid:34111313 fatcat:5htxzbmixbddhhs7tvne3j3e7y

Sifting Common Information from Many Variables

Greg Ver Steeg, Shuyang Gao, Kyle Reing, Aram Galstyan
2017 Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence  
This scalable approach allows us to demonstrate the usefulness of common information in high-dimensional learning problems. The sieve outperforms standard methods on dimensionality reduction tasks, solves a blind source separation problem that cannot be solved with ICA, and accurately recovers structure in brain imaging data.  ...  If we have N samples and n variables, then we calculate labels for each data point, y = w · x, which amounts to N dot products of vectors with length n.  ...
doi:10.24963/ijcai.2017/402 dblp:conf/ijcai/SteegGRG17 fatcat:p5n6zezb7rcv5ityoiswdqk2iq
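
The label computation quoted in the snippet (y = w · x for each of N samples of n variables, i.e. N dot products of length n) is just a matrix-vector product in NumPy; the random data below is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N, n = 1000, 50                     # N samples, n variables
X = rng.standard_normal((N, n))     # data matrix, one row per sample
w = rng.standard_normal(n)          # weight vector of length n
y = X @ w                           # N dot products of length n: one label per sample
```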

The Information Sieve [article]

Greg Ver Steeg, Aram Galstyan
2016 arXiv   pre-print
Ultimately, we are left with a set of latent factors explaining all the dependence in the original data and remainder information consisting of independent noise.  ...  Each layer of the sieve recovers a single latent factor that is maximally informative about multivariate dependence in the data.  ...  For each sample, we start with a random probabilistic label.  ... 
arXiv:1507.02284v3 fatcat:nupzcq4ydve3tdtdewspmbgsia

Sifting Common Information from Many Variables [article]

Greg Ver Steeg, Shuyang Gao, Kyle Reing, Aram Galstyan
2017 arXiv   pre-print
This scalable approach allows us to demonstrate the usefulness of common information in high-dimensional learning problems.  ...  The sieve outperforms standard methods on dimensionality reduction tasks, solves a blind source separation problem that cannot be solved with ICA, and accurately recovers structure in brain imaging data.  ...  If we have N samples and n variables, then we calculate labels for each data point, y = w · x, which amounts to N dot products of vectors with length n.  ...
arXiv:1606.02307v4 fatcat:cdtah57cnfafrky4cbxq5twznq

Active Learning for Deep Neural Networks on Edge Devices [article]

Yuya Senzaki, Christian Hamelain
2021 arXiv   pre-print
Although updating a model with real incoming data is ideal, using all of it is not always feasible due to constraints such as labeling and communication costs.  ...  We evaluate our approach on both classification and object detection tasks in a practical setting to simulate a real-life scenario.  ...  Active learning aims to train a model with few labeled samples by selecting data to label from a (large) pool of unlabeled data [36].  ...
arXiv:2106.10836v1 fatcat:dqozjf23k5bcblzr65qb3yqtha
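
The pool-based active-learning loop referred to in the snippet, choosing which unlabeled samples to send for labeling, is most simply illustrated with uncertainty sampling. This is the textbook acquisition rule, not the paper's edge-specific strategy, and `probs` and `budget` are placeholders.

```python
import numpy as np

def select_for_labeling(probs, budget):
    """Uncertainty sampling: given softmax outputs `probs` of shape
    (num_unlabeled, num_classes) from the current model, return the indices
    of the `budget` least-confident samples to query for labels."""
    confidence = probs.max(axis=1)
    return np.argsort(confidence)[:budget]
```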

Elucidating Noisy Data via Uncertainty-Aware Robust Learning [article]

Jeongeun Park, Seungyoun Shin, Sangheum Hwang, Sungjoon Choi
2021 arXiv   pre-print
Our proposed method can not only successfully learn the clean target distribution from a dirty dataset but can also estimate the underlying noise pattern.  ...  Robust learning methods aim to learn a clean target distribution from noisy and corrupted training data where a specific corruption pattern is often assumed a priori.  ...  to as an instance-dependent noise (IDN) learning problem [6].  ...
arXiv:2111.01632v1 fatcat:sdzqsj5wdne27d4ngjeukmpyfy

Learning-based Multi-Sieve Co-reference Resolution with Knowledge

Lev-Arie Ratinov, Dan Roth
2012 Conference on Empirical Methods in Natural Language Processing  
To maximize the utility of the injected knowledge, we deploy a learning-based multi-sieve approach and develop novel entity-based features.  ...  We thank Nicholas Rizzolo and Kai Wei Chang for their invaluable help with modifying the baseline co-reference system. We thank the anonymous EMNLP reviewers for constructive comments.  ...  ., 2010) into a multi-sieve machine learning framework. We show that training sieve-specific models significantly increases the performance on most intermediate sieves.  ...
dblp:conf/emnlp/RatinovR12 fatcat:4f4z6rtb7naqhnuedx3d6am474
Showing results 1 — 15 out of 1,812 results