A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Beyond Synthetic Noise: Deep Learning on Controlled Noisy Labels
[article]
2020
arXiv
pre-print
Performing controlled experiments on noisy data is essential in understanding deep learning across noise levels. ...
Due to the lack of suitable datasets, previous research has only examined deep learning on controlled synthetic label noise, and real-world label noise has never been studied in a controlled setting. ...
The data labeling is supported by the Google Cloud labeling service. ...
arXiv:1911.09781v3
fatcat:l2yhsf4oebcs3b7wkxl2uamcfu
Beyond Photo Realism for Domain Adaptation from Synthetic Data
[article]
2019
arXiv
pre-print
We accomplish this by learning a generative model to perform shading of synthetic geometry conditioned on a "g-buffer" representation of the scene to render, as well as a low sample Monte Carlo rendered ...
As synthetic imagery is used more frequently in training deep models, it is important to understand how different synthesis techniques impact the performance of such models. ...
Introduction Applying deep learning to supervised computer vision tasks commonly requires large labeled datasets [5, 14], which can be time consuming, expensive or impractical to collect. ...
arXiv:1909.01960v1
fatcat:cnab4fotb5fyjhjboitdx6uppe
An Instance-Dependent Simulation Framework for Learning with Label Noise
[article]
2021
arXiv
pre-print
We also benchmark several existing algorithms for learning with noisy labels and compare their behavior on our synthetic datasets and on the datasets with independent random label noise. ...
Equipped with controllable label noise, we study the negative impact of noisy labels across a few practical settings to understand when label noise is more problematic. ...
Impact of Label Noise on Deep Learning Models With the instance-dependent synthetic datasets with noisy labels, our next step is to study the impact of noisy labels on deep learning models. ...
arXiv:2107.11413v4
fatcat:djqsonbbend53er4n4zkzfisxu
Open-set Label Noise Can Improve Robustness Against Inherent Label Noise
[article]
2021
arXiv
pre-print
Learning with noisy labels is a practically challenging problem in weakly supervised learning. ...
achieves significant improvement on Out-of-Distribution detection tasks even in the label noise setting. ...
deep learning. ...
arXiv:2106.10891v3
fatcat:2vjoezev7za4bbmzz5ipqakp5y
Noise-Resistant Deep Metric Learning with Probabilistic Instance Filtering
[article]
2021
arXiv
pre-print
Previous research mostly focuses on enhancing classification models against noisy labels, while the robustness of deep metric learning (DML) against noisy labels remains less well-explored. ...
Extensive experiments on both synthetic and real-world noisy datasets show that the proposed approach achieves up to 8.37% higher Precision@1 compared with the best-performing state-of-the-art baseline ...
Yang, “Beyond synthetic noise: Deep learning on controlled noisy labels.” ...
is set to 1. ...
arXiv:2108.01431v2
fatcat:vz5d4iqshvds7iaqohl7js4ieq
Noisy Concurrent Training for Efficient Learning under Label Noise
[article]
2020
arXiv
pre-print
Deep neural networks (DNNs) fail to learn effectively under label noise and have been shown to memorize random labels which affect their generalization performance. ...
We demonstrate the effectiveness of our approach on both synthetic and real-world noisy benchmark datasets. ...
We showed the effectiveness of our method on multiple synthetic noisy datasets with varying degrees and types of label noise as well as real-world noisy datasets. ...
arXiv:2009.08325v1
fatcat:gyablpp6ujhmzdifkffxacukti
A Second-Order Approach to Learning with Instance-Dependent Label Noise
[article]
2021
arXiv
pre-print
Experiments on CIFAR10 and CIFAR100 with synthetic instance-dependent label noise and Clothing1M with real-world human label noise verify our approach. ...
The presence of label noise often misleads the training of deep neural networks. ...
Beyond synthetic noise: Deep learning on controlled noisy labels. In Proceedings of the 37th International Conference on Machine Learning, volume 119, pages 4804-4815. ...
arXiv:2012.11854v2
fatcat:en3qcug6s5c67kg3bwvsjtmu5y
IMAE for Noise-Robust Learning: Mean Absolute Error Does Not Treat Examples Equally and Gradient Magnitude's Variance Matters
[article]
2020
arXiv
pre-print
We prove IMAE's effectiveness using extensive experiments: image classification under clean labels, synthetic label noise, and real-world unknown noise. ...
In this work, we study robust deep learning against abnormal training data from the perspective of example weighting built into empirical loss functions, i.e., gradient magnitude with respect to logits, ...
However, it is beyond the scope of this work since we focus on analysing MAE and how to improve MAE here. We will investigate this claim in other loss functions in our future work. ...
arXiv:1903.12141v9
fatcat:h5pbekokxjddvhhpypbcxe52ty
Noise Robust Generative Adversarial Networks
[article]
2020
arXiv
pre-print
As an alternative, we propose a novel family of GANs called noise robust GANs (NR-GANs), which can learn a clean image generator even when training images are noisy. ...
On three benchmark datasets, we demonstrate the effectiveness of NR-GANs in noise robust image generation. Furthermore, we show the applicability of NR-GANs in image denoising. ...
Learning deep networks from noisy labels with dropout regularization. In ICDM, 2016.
[33] Takuhiro Kaneko and Tatsuya Harada. ...
arXiv:1911.11776v2
fatcat:rfgqwn3wbbebjmaxrq3empt7fu
Correlated Input-Dependent Label Noise in Large-Scale Image Classification
[article]
2021
arXiv
pre-print
Large scale image classification datasets often contain noisy labels. ...
We demonstrate that the learned covariance structure captures known sources of label noise between semantically similar and co-occurring classes. ...
International Conference on Machine Learning (ICML), 2018.
[24] Lu Jiang, Di Huang, Mason Liu, and Weilong Yang. Beyond synthetic noise: Deep learning on controlled noisy labels. ...
arXiv:2105.10305v1
fatcat:r75nlz5ymbdyddwkjurmxhzq5q
Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates
[article]
2020
arXiv
pre-print
Learning with noisy labels is a common challenge in supervised learning. ...
Existing approaches often require practitioners to specify noise rates, i.e., a set of parameters controlling the severity of label noises in the problem, and the specifications are either assumed to be ...
Acknowledgement Yang Liu would like to thank Yiling Chen for inspiring early discussions on this problem. ...
arXiv:1910.03231v7
fatcat:eg2ho27mevgppevdn2gsxpc2p4
NAT: Noise-Aware Training for Robust Neural Sequence Labeling
[article]
2020
arXiv
pre-print
To this end, we formulate the noisy sequence labeling problem, where the input may undergo an unknown noising process, and propose two Noise-Aware Training (NAT) objectives that improve robustness of sequence labeling performed on perturbed input: our data augmentation method trains a neural model using a mixture of clean and noisy samples, whereas our stability training algorithm encourages the model to create ...
We use the hyper-parameter η to control the amount of noise to be induced with this method. ...
arXiv:2005.07162v1
fatcat:4endrvdl4zeqhfau2ycepstp5m
Learning with Instance-Dependent Label Noise: A Sample Sieve Approach
[article]
2021
arXiv
pre-print
Much of the literature (with several recent exceptions) of learning with noisy labels focuses on the case when the label noise is independent of features. ...
We demonstrate the performance of CORES^2 on CIFAR10 and CIFAR100 datasets with synthetic instance-dependent label noise and Clothing1M with real-world human noise. ...
Deep self-learning from noisy labels. In Proceedings of the IEEE International Conference on Computer Vision, pp. 5138-5147, 2019. ...
arXiv:2010.02347v2
fatcat:zd7uegcxevhblotjhptqwc6dde
On the Robustness of Intent Classification and Slot Labeling in Goal-oriented Dialog Systems to Real-world Noise
[article]
2021
arXiv
pre-print
Intent Classification (IC) and Slot Labeling (SL) models, which form the basis of dialogue systems, often encounter noisy data in real-world environments. ...
By leveraging cross-noise robustness transfer -- training on one noise type to improve robustness on another noise type -- we design aggregate data-augmentation approaches that increase the model performance ...
Performance on Noised Test Sets For each noise type, we design a control set that contains only the clean version of the noised utterances in the treatment set. ...
arXiv:2104.07149v2
fatcat:fj3tvh6b6bgx3bbq5bq4rckzam
Physics-based Noise Modeling for Extreme Low-light Photography
[article]
2021
arXiv
pre-print
In this regard, although promising results have been shown recently with deep convolutional neural networks, the success heavily depends on abundant noisy clean image pairs for training, which are tremendously ...
Extensive experiments on multiple low-light denoising datasets -- including a newly collected one in this work covering various devices -- show that a deep neural network trained with our proposed noise ...
In the modern era, most image denoising algorithms are entirely data-driven, relying on deep neural networks that implicitly learn the statistical regularities to infer clean images from their noisy ...
arXiv:2108.02158v1
fatcat:fepesyywrbfwfjbkfvhgycpbh4
Showing results 1 — 15 out of 4,988 results