40,029 Hits in 3.6 sec

Exploiting Context for Robustness to Label Noise in Active Learning [article]

Sudipta Paul, Shivkumar Chandrasekaran, B.S. Manjunath, Amit K. Roy-Chowdhury
2020 arXiv   pre-print
Several works in computer vision have demonstrated the effectiveness of active learning for adapting the recognition model when new unlabeled data becomes available.  ...  To address these problems, we propose a noisy-label-filtering learning approach in which the inter-relationships (context) common in natural data are used to detect wrong labels  ...  In this work, we propose a Context-aware Noisy Label Detection (CNLD) approach to detect wrong labels and build on CNLD to formalize an active learning framework that handles the adverse impact of label noise  ... 
arXiv:2010.09066v1
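The neighbourhood-agreement idea behind such context-based filtering can be sketched in a few lines (a toy illustration, not the authors' CNLD; `knn_label_agreement`, the 1-D points, and the 0.5 threshold are all hypothetical choices):

```python
def knn_label_agreement(points, labels, idx, k=3):
    """Fraction of the k nearest neighbours sharing point idx's label.
    A low score flags the label as likely noisy."""
    dists = sorted(
        (abs(points[j] - points[idx]), j) for j in range(len(points)) if j != idx
    )
    neighbours = [j for _, j in dists[:k]]
    return sum(labels[j] == labels[idx] for j in neighbours) / k

# Two well-separated 1-D clusters; index 3 carries a flipped label.
points = [0.0, 0.1, 0.2, 0.3, 5.0, 5.1, 5.2, 5.3]
labels = [0, 0, 0, 1, 1, 1, 1, 1]

scores = [knn_label_agreement(points, labels, i) for i in range(len(points))]
noisy = [i for i, s in enumerate(scores) if s < 0.5]
print(noisy)  # only the flipped label falls below the agreement threshold
```

Samples flagged this way would be excluded or down-weighted before the active-learning query step.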

Modelling non-stationary noise with spectral factorisation in automatic speech recognition

Antti Hurmalainen, Jort F. Gemmeke, Tuomas Virtanen
2013 Computer Speech and Language  
To adapt the system to varying environments, noise models are acquired from the context, or learnt from the mixture itself without prior information.  ...  This study applies spectral factorisation algorithms and long temporal context for separating speech and noise from mixed signals.  ...  In order to reduce the risk of misclassification due to incorrect or overly strict label associations, we learn the mapping from activations to states by factorising the 200 training utterances not used  ... 
doi:10.1016/j.csl.2012.07.008
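As a rough sketch of the factorisation machinery (standard Lee–Seung multiplicative updates on a synthetic magnitude spectrogram, not the paper's exemplar-based long-context models; the sizes F, T, R are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy magnitude spectrogram: F frequency bins x T frames, generated from
# two hidden nonnegative "sources" so a rank-2 factorisation can recover it.
F, T, R = 20, 30, 2
W_true = np.abs(rng.standard_normal((F, R)))
H_true = np.abs(rng.standard_normal((R, T)))
V = W_true @ H_true + 1e-6

# Multiplicative-update NMF minimising the Euclidean reconstruction error.
W = np.abs(rng.standard_normal((F, R))) + 1e-3
H = np.abs(rng.standard_normal((R, T))) + 1e-3
errors = []
for _ in range(200):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
    W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
    errors.append(np.linalg.norm(V - W @ H))

print(errors[0], errors[-1])
```

In a speech/noise system the columns of W would be speech and noise bases, and the activations H would be used to reconstruct each stream separately.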

Recognizing Scenes by Simulating Implied Social Interaction Networks [chapter]

MaryAnne Fields, Craig Lennon, Christian Lebiere, Michael K. Martin
2015 Lecture Notes in Computer Science  
Exploiting Cognitive Context OBJECTIVE & BENEFITS • Exploit cognitive context to augment bottom-up perceptual approaches • Leverage activation mechanisms in ACT-R to provide contextual expectations  ...  knowledge structures (SHOGs) to cognitive models • Instance-based learning in ACT-R • Global graph properties = scene gist • Local graph properties = exemplars of object in context (scene content  ... 
doi:10.1007/978-3-319-22873-0_32

Voice activity detection using convolutive non-negative sparse coding

Peng Teng, Yunde Jia
2013 2013 IEEE International Conference on Acoustics, Speech and Signal Processing  
Finally, the activity labels are given by decoding a conditional random field (CRF) which is constructed to model the context of an audio signal for VAD.  ...  Our idea is to use noise-robust features for speech signal detection while the noise is suppressed.  ...  to obtain noise-robust bases for representing speech; {D_n^r} is low-rank and learned from noise signal samples using CNMF so that it can fit noise well with its few bases.  ... 
doi:10.1109/icassp.2013.6639095 dblp:conf/icassp/TengJ13
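The dictionary-based detection idea can be sketched as follows (a simplified stand-in: random nonnegative "bases" instead of CNMF-learned ones, plain NMF-style multiplicative updates instead of convolutive sparse coding, and a hypothetical activation-share score in place of the CRF decoding):

```python
import numpy as np

rng = np.random.default_rng(1)
F = 16
D_speech = np.abs(rng.standard_normal((F, 3)))  # stand-in for learned speech bases
D_noise = np.abs(rng.standard_normal((F, 3)))   # stand-in for low-rank noise bases
D = np.hstack([D_speech, D_noise])

def activations(v, D, iters=200):
    """Nonnegative coding of one spectral frame v against dictionary D,
    via multiplicative updates for nonnegative least squares."""
    h = np.ones(D.shape[1])
    for _ in range(iters):
        h *= (D.T @ v) / (D.T @ D @ h + 1e-12)
    return h

# One noise-only frame and one frame with strong speech content.
noise_frame = D_noise @ np.array([1.0, 0.5, 0.2])
speech_frame = D_speech @ np.array([1.0, 1.0, 0.3]) + 0.1 * noise_frame

shares = []
for v in (noise_frame, speech_frame):
    h = activations(v, D)
    shares.append(h[:3].sum() / h.sum())  # fraction of activation on speech bases
print(shares)
```

A CRF (or even a simple threshold with temporal smoothing) would then turn these per-frame scores into speech/non-speech labels.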

CPRAL: Collaborative Panoptic-Regional Active Learning for Semantic Segmentation [article]

Yu Qiao, Jincheng Zhu, Chengjiang Long, Zeyao Zhang, Yuxin Wang, Zhenjun Du, Xin Yang
2022 arXiv   pre-print
In this paper, we propose a novel Collaborative Panoptic-Regional Active Learning framework (CPRAL) to address the semantic segmentation task.  ...  For a small batch of images initially sampled with pixel-wise annotations, we employ panoptic information to initially select unlabeled samples.  ...  Acknowledgments This work was supported in part by the National Natural Science Foundation of China under Grants 61972067 and U1908214, the Natural Science Foundation of Liaoning under Grant 20180520032, and the  ... 
arXiv:2112.05975v2

Auxiliary Image Regularization for Deep CNNs with Noisy Labels [article]

Samaneh Azadi, Jiashi Feng, Stefanie Jegelka, Trevor Darrell
2016 arXiv   pre-print
Comprehensive experiments on benchmark data sets clearly demonstrate that our proposed regularized CNN model is resistant to label noise in training data.  ...  mutual context information among training images and encourages the model to select reliable images to robustify the learning process.  ...  Being able to exploit this rich resource seems promising for learning a deep classification model.  ... 
arXiv:1511.07069v2

Learning with noisy supervision for Spoken Language Understanding

Christian Raymond, Giuseppe Riccardi
2008 Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing  
We show that our noise-robust algorithm can improve accuracy by up to 6% (absolute), depending on the noise level and the labeling cost.  ...  We investigate two alternative noise-robust active learning strategies that are either data-intensive or supervision-intensive.  ...  Acknowledgment We would like to thank Yulan He for sharing with us her ATIS annotated dataset.  ... 
doi:10.1109/icassp.2008.4518778 dblp:conf/icassp/RaymondR08

Medi-Care AI: Predicting Medications From Billing Codes via Robust Recurrent Neural Networks [article]

Deyin Liu, Lin Wu, Xue Li
2019 arXiv   pre-print
to improved RNN robustness to data variability in terms of missing values and multiple errors.  ...  In this paper, we present an effective deep prediction framework based on robust recurrent neural networks (RNNs) to predict the likely therapeutic classes of medications a patient is taking, given a sequence  ...  the underlying RNN in the context of noise injection.  ... 
arXiv:2001.10065v1

Incremental Relabeling for Active Learning with Noisy Crowdsourced Annotations

Liyue Zhao, Gita Sukthankar, Rahul Sukthankar
2011 2011 IEEE Third Int'l Conference on Privacy, Security, Risk and Trust and 2011 IEEE Third Int'l Conference on Social Computing  
We propose an active learning method that is specifically designed to be robust to such noise.  ...  Unfortunately, most active learning strategies are myopic and sensitive to label noise, which leads to poorly trained classifiers.  ...  ACKNOWLEDGMENTS This research was supported in part by DARPA award N10AP20027.  ... 
doi:10.1109/passat/socialcom.2011.193 dblp:conf/socialcom/ZhaoSS11
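A minimal simulation of vote-until-confident relabeling (hypothetical `flip_prob`, `margin`, and query budget; a generic stand-in, not the authors' actual acquisition strategy):

```python
import random
from collections import Counter

random.seed(0)

def noisy_annotator(true_label, flip_prob=0.3):
    """Simulated crowd worker that flips a binary label with some probability."""
    return true_label ^ 1 if random.random() < flip_prob else true_label

def incremental_relabel(true_label, max_queries=15, margin=3):
    """Keep requesting labels until one class leads by `margin` votes,
    a simple confidence rule for deciding when to stop relabeling."""
    votes = Counter()
    for _ in range(max_queries):
        votes[noisy_annotator(true_label)] += 1
        (top, n1), *rest = votes.most_common()
        n2 = rest[0][1] if rest else 0
        if n1 - n2 >= margin:
            return top, sum(votes.values())
    return votes.most_common(1)[0][0], sum(votes.values())

results = [incremental_relabel(1) for _ in range(50)]
accuracy = sum(lbl == 1 for lbl, _ in results) / len(results)
avg_queries = sum(q for _, q in results) / len(results)
print(accuracy, avg_queries)
```

The point of the incremental scheme is that aggregated accuracy far exceeds the 70% of a single annotator while spending far fewer queries than always collecting the full budget.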

Multi-class Multi-annotator Active Learning with Robust Gaussian Process for Visual Recognition

Chengjiang Long, Gang Hua
2015 2015 IEEE International Conference on Computer Vision (ICCV)  
Active learning is an effective way to relieve the tedious work of manual annotation in many applications of visual recognition.  ...  Also, we incorporate the idea of reinforcement learning to actively select both the informative samples and the high-quality annotators, which better explores the trade-off between exploitation and exploration  ...  All these demonstrate that our proposed MARMGPC-ASAA is robust against label noise in the learning process.  ... 
doi:10.1109/iccv.2015.325 dblp:conf/iccv/LongH15

On the intrinsic robustness to noise of some leading classifiers and symmetric loss function – an empirical evaluation [article]

Hugo Le Baher
2021 arXiv   pre-print
We propose a benchmark to evaluate the natural robustness of different algorithms taken from various paradigms on artificially corrupted datasets, with a focus on noisy labels.  ...  these labels may be weak in quantity, quality or trustworthiness.  ...  In their publication, the authors compare the context of label noise to the context of missing values, described in [34] .  ... 
arXiv:2010.13570v5
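The symmetry property that underlies this kind of robustness (a loss whose sum over all labels is constant for any prediction is provably tolerant to symmetric label noise) can be checked numerically; MAE satisfies it and cross-entropy does not. The probability vectors below are arbitrary:

```python
import math

def mae_loss(probs, label):
    # L1 distance between the one-hot target and the predicted distribution.
    return sum(abs((1.0 if k == label else 0.0) - p) for k, p in enumerate(probs))

def ce_loss(probs, label):
    return -math.log(probs[label])

K = 3
mae_sums, ce_sums = [], []
for probs in ([0.7, 0.2, 0.1], [0.4, 0.35, 0.25]):
    mae_sums.append(sum(mae_loss(probs, k) for k in range(K)))
    ce_sums.append(sum(ce_loss(probs, k) for k in range(K)))
print(mae_sums, ce_sums)
```

For MAE the sum over labels is always 2(K-1) = 4 regardless of the prediction, so symmetric label flips only rescale and shift the expected risk; for cross-entropy the sum depends on the prediction, which is why it degrades under heavy label noise.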

A Machine Learning based Robust Prediction Model for Real-life Mobile Phone Data [article]

Iqbal H. Sarker
2019 arXiv   pre-print
In this paper, we address these issues and present a robust prediction model for real-life mobile phone data of individual users, in order to improve the prediction accuracy of the model.  ...  After that, we employ the most popular rule-based machine learning classification technique, i.e., decision tree, on the noise-free quality dataset to build the prediction model.  ...  Ashad Kabir, Charles Sturt University, Australia for their relevant discussions.  ... 
arXiv:1902.07588v1

On the Robustness of Monte Carlo Dropout Trained with Noisy Labels [article]

Purvi Goel, Li Chen
2021 arXiv   pre-print
The memorization effect of deep learning hinders its ability to generalize effectively on the test set when learning with noisy labels.  ...  deviation on each neuron's activation; 3. network sparsity: investigating the network support of MCDropout in comparison with deterministic neural networks.  ...  Authors in [10, 28, 25, 18] devised robust loss functions to achieve a smaller risk for unseen clean data when learning with noisy labels.  ... 
arXiv:2103.12002v1
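MCDropout itself is easy to sketch: keep dropout active at test time and treat the spread of repeated stochastic forward passes as uncertainty. Below is a toy 3-2-1 ReLU network with hypothetical weights, not the paper's setup:

```python
import random
import statistics

random.seed(0)

W = [[0.5, -0.2, 0.8], [0.1, 0.9, -0.4]]  # hidden weights: 2 units x 3 inputs
V = [1.0, -1.0]                            # output weights

def forward(x, drop_p=0.0):
    """One pass through a tiny 3-2-1 network; each hidden unit is dropped
    with probability drop_p (inverted-dropout scaling preserves the mean)."""
    h = []
    for w in W:
        keep = 1.0 if random.random() >= drop_p else 0.0
        pre = sum(wi * xi for wi, xi in zip(w, x))
        h.append(max(pre, 0.0) * keep / (1.0 - drop_p))
    return sum(v * hi for v, hi in zip(V, h))

x = [0.3, 0.6, -0.1]
samples = [forward(x, drop_p=0.5) for _ in range(1000)]
mean, std = statistics.mean(samples), statistics.stdev(samples)
print(mean, std)  # predictive mean and spread from the stochastic passes
```

The per-input standard deviation is exactly the kind of signal one can then inspect on clean versus noisily labelled examples.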

Towards On-Board Hyperspectral Satellite Image Segmentation: Understanding Robustness of Deep Learning through Simulating Acquisition Conditions

Jakub Nalepa, Michal Myller, Marcin Cwiek, Lukasz Zak, Tomasz Lakota, Lukasz Tulczyjew, Michal Kawulok
2021 Remote Sensing  
Classifying and segmenting such imagery are the pivotal steps in virtually all applications, hence developing new techniques for these tasks is a vital research area.  ...  Here, deep learning has established the current state of the art.  ...  Acknowledgments: We thank Bertrand Le Saux (European Space Agency) for lots of fruitful discussions that helped us improve the work reported in this manuscript.  ... 
doi:10.3390/rs13081532

Learning by active nonlinear diffusion

Mauro Maggioni (Department of Mathematics, Department of Applied Mathematics and Statistics, Mathematical Institute of Data Sciences, Institute of Data Intensive Engineering and Science, Johns Hopkins University, Baltimore, MD 21218, USA), James M. Murphy (Department of Mathematics, Tufts University, Medford, MA 02155, USA)
2019 Foundations of Data Science  
This article proposes an active learning method for high-dimensional data, based on intrinsic data geometries learned through diffusion processes on graphs.  ...  Diffusion distances are used to parametrize low-dimensional structures on the dataset, which allow for high-accuracy labelings with only a small number of carefully chosen training labels.  ...  We are grateful to the anonymous reviewer for many helpful comments and suggestions which significantly improved the manuscript.  ... 
doi:10.3934/fods.2019012
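A small sketch of diffusion distances on a graph (using unweighted t-step transition profiles rather than the full stationary-measure weighting of the diffusion-maps literature; the points, kernel width, and t are arbitrary choices):

```python
import numpy as np

# Six points in two 1-D clusters; diffusion distances computed from powers
# of the random-walk matrix should separate the clusters cleanly.
pts = np.array([0.0, 0.1, 0.2, 3.0, 3.1, 3.2])
K = np.exp(-(pts[:, None] - pts[None, :]) ** 2 / 0.5)  # Gaussian affinities
P = K / K.sum(axis=1, keepdims=True)                   # random-walk matrix

t = 4
Pt = np.linalg.matrix_power(P, t)

def diffusion_dist(i, j):
    """Euclidean distance between the t-step transition profiles of i and j."""
    return np.linalg.norm(Pt[i] - Pt[j])

within = diffusion_dist(0, 2)   # same cluster
between = diffusion_dist(0, 3)  # opposite clusters
print(within, between)
```

Because the random walk mixes quickly inside a cluster but barely crosses between clusters, within-cluster diffusion distances collapse toward zero while between-cluster distances stay large, which is what makes a handful of carefully placed training labels go a long way.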
Showing results 1 — 15 out of 40,029 results