Real-Time Uncertainty Estimation in Computer Vision via Uncertainty-Aware Distribution Distillation
[article]
2020
arXiv
pre-print
We propose a simple, easy-to-optimize distillation method for learning the conditional predictive distribution of a pre-trained dropout model for fast, sample-free uncertainty estimation in computer vision ...
uncertainty quantification, while achieving improved quality of both the uncertainty estimates and predictive performance over the regular dropout model. ...
The running time of MC Dropout is optimized by caching results before the first dropout layer for a fair comparison. ...
arXiv:2007.15857v2
fatcat:j4gzazka35btlnv3u3zl7gjyni
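The caching optimization mentioned in the snippet (evaluating the network once up to the first dropout layer, then re-running only the stochastic tail for each Monte Carlo sample) can be sketched generically in plain Python. The toy `prefix`/`head` network below is illustrative, not the paper's model.

```python
import random

def dropout(units, p, rng):
    # Inverted dropout: zero each unit with probability p and
    # rescale survivors so the expected activation is preserved.
    return [0.0 if rng.random() < p else u / (1.0 - p) for u in units]

def mc_dropout_predict(prefix, head, x, p=0.5, n_samples=1000, seed=0):
    """MC Dropout with the caching trick: the deterministic layers
    before the first dropout layer are evaluated once, and only the
    stochastic tail (head) is re-run for each sample."""
    rng = random.Random(seed)
    features = prefix(x)  # computed once and cached across samples
    preds = [head(dropout(features, p, rng)) for _ in range(n_samples)]
    mean = sum(preds) / n_samples
    var = sum((y - mean) ** 2 for y in preds) / n_samples
    return mean, var  # predictive mean and sample variance

# Toy "network": deterministic prefix producing two features, linear head.
mean, var = mc_dropout_predict(lambda x: [x, 2.0 * x], sum, 1.0,
                               p=0.5, n_samples=2000)
```

Because inverted dropout preserves expectations, the predictive mean stays close to the deterministic output while the variance reflects the sampling spread.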
STUN: Self-Teaching Uncertainty Estimation for Place Recognition
[article]
2022
arXiv
pre-print
During the online inference phase, we only use the student net to generate a place prediction in conjunction with the uncertainty. ...
However, place recognition in the wild often suffers from erroneous predictions due to image variations, e.g., changing viewpoints and street appearance. ...
BTL [5] : We follow the parameters of the original paper without extra modification for a fair comparison. ...
arXiv:2203.01851v1
fatcat:7kwmog5f3fffrads64cvx6bf7a
Gaze Training by Modulated Dropout Improves Imitation Learning
[article]
2019
arXiv
pre-print
Prediction error in steering commands is reduced by 23.5% compared to uniform dropout. ...
Consistent with these results, the gaze-modulated dropout net shows lower model uncertainty. ...
Gaze-Modulated Dropout Evaluation
1) Prediction error vs. drop probability: To make a fair comparison, we scan over drop probabilities from 0.1 to 0.8 in steps of 0.1. ...
arXiv:1904.08377v2
fatcat:yleqwqx2nfbjbab5ela26p67ne
Iterative Distillation for Better Uncertainty Estimates in Multitask Emotion Recognition
[article]
2021
arXiv
pre-print
Our method generates single student models that provide accurate estimates of uncertainty for in-domain samples and a student ensemble that can detect out-of-domain samples. ...
From a Bayesian perspective, we propose to use deep ensembles to capture uncertainty for multiple emotion descriptors, i.e., action units, discrete expression labels and continuous descriptors. ...
For a fair comparison, we adopted the test-time cross-validation in [1] to compute the NLL under temperature scaling (TS). The optimal temperature was optimized on a randomly split half of the validation set. ...
arXiv:2108.04228v2
fatcat:zlqous3inbdnnlvmvzhsvwtqsy
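Temperature scaling, as referenced in the snippet above, fits a single scalar temperature on held-out data by minimizing the negative log-likelihood. The grid-search sketch below is a generic illustration of that procedure, not the paper's code; the validation set and grid are made up.

```python
import math

def softmax(logits, temperature=1.0):
    # Numerically stable softmax with a temperature parameter.
    m = max(logits)
    exps = [math.exp((z - m) / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def mean_nll(val_set, temperature):
    # Average negative log-likelihood of the true labels.
    return sum(-math.log(softmax(logits, temperature)[label])
               for logits, label in val_set) / len(val_set)

def fit_temperature(val_set, grid=None):
    # Pick the temperature on a held-out split that minimizes NLL.
    grid = grid or [0.5 + 0.1 * i for i in range(46)]  # 0.5 .. 5.0
    return min(grid, key=lambda t: mean_nll(val_set, t))

# Overconfident logits whose true label is sometimes wrong: the fitted
# temperature should be > 1, softening the predictive distribution.
val = [([4.0, 0.0], 0), ([4.0, 0.0], 1), ([0.0, 4.0], 1), ([4.0, 0.0], 0)]
t_opt = fit_temperature(val)
```

A temperature above 1 spreads probability mass away from the top class, which lowers the NLL whenever the raw model is overconfident.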
Semi-supervised Left Atrium Segmentation with Mutual Consistency Training
[article]
2021
arXiv
pre-print
We believe that these unlabeled regions may contain more crucial information to minimize the uncertainty prediction for the model and should be emphasized in the training process. ...
Such mutual consistency encourages the two decoders to have consistent and low-entropy predictions and enables the model to gradually capture generalized features from these unlabeled challenging regions ...
We also appreciate the efforts devoted to collecting and sharing the LA database [16] and several available repositories [6, 7, 17]. ...
arXiv:2103.02911v2
fatcat:q5la4hz2hvcffeqju5e47pok5a
College Student Retention Risk Analysis From Educational Database using Multi-Task Multi-Modal Neural Fusion
[article]
2021
arXiv
pre-print
We develop a Multimodal Spatiotemporal Neural Fusion network for Multi-Task Learning (MSNF-MTCL) to predict five important student retention risks: future dropout, next-semester dropout, type of dropout, duration of dropout, and cause of dropout. ...
[32] proposed a fair student dropout prediction system from educational database. ...
arXiv:2109.05178v1
fatcat:4adtjrfj2ba6pceqv4am3kwuda
Assembly Quality Detection Based on Class-Imbalanced Semi-Supervised Learning
2021
Applied Sciences
Based on the mean teacher algorithm, the proposed algorithm uses certainty to dynamically select reliable teacher predictions for student learning, and loss functions are modified to improve the model's ...
Due to the imperfect assembly process, the unqualified assembly of a missing gasket or lead seal will affect the product's performance and possibly cause safety accidents. ...
Acknowledgments: This work was supported by the State Key Laboratory of Modern Optical Instrumentation of Zhejiang University and Zernike Optics Co., Ltd. ...
doi:10.3390/app112110373
fatcat:dhzewau3r5ffpml5h7vez5gtuu
Certainty Driven Consistency Loss on Multi-Teacher Networks for Semi-Supervised Learning
[article]
2021
arXiv
pre-print
In this paper, we propose a novel Certainty-driven Consistency Loss (CCL) that exploits the predictive uncertainty in the consistency loss to let the student dynamically learn from reliable targets. ...
Typically, a student model is trained to be consistent with teacher prediction for the inputs under different perturbations. ...
In Filtering CCL, shown in Fig. 1(a), the teacher filters out uncertain predictions and gradually selects a subset of certain predictions (i.e., of low uncertainty) that are robust targets for the student ...
arXiv:1901.05657v7
fatcat:zovhmsmelfh3nah2t7bsz27gpu
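The filtering step described in the snippet (keeping only low-uncertainty teacher predictions as consistency targets for the student) is commonly implemented with a predictive-entropy threshold. The sketch below illustrates that generic idea; the threshold value and toy predictions are assumptions, not taken from the paper.

```python
import math

def predictive_entropy(probs):
    # Shannon entropy of a categorical predictive distribution (in nats);
    # low entropy means a confident (certain) prediction.
    return -sum(p * math.log(p) for p in probs if p > 0.0)

def filter_certain(teacher_probs, threshold):
    """Indices of teacher predictions certain enough (entropy below
    the threshold) to serve as consistency targets for the student."""
    return [i for i, probs in enumerate(teacher_probs)
            if predictive_entropy(probs) < threshold]

teacher = [
    [0.98, 0.01, 0.01],  # confident -> kept
    [0.40, 0.35, 0.25],  # uncertain -> filtered out
    [0.90, 0.05, 0.05],  # confident -> kept
]
kept = filter_certain(teacher, threshold=0.5)
```

Gradually raising the threshold over training lets the student learn from an increasing subset of teacher targets, as the snippet describes.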
Uncertainty-aware Mean Teacher for Source-free Unsupervised Domain Adaptive 3D Object Detection
[article]
2021
arXiv
pre-print
Effectively, we perform automatic soft-sampling of pseudo-labeled data while aligning predictions from the student and teacher networks. ...
In order to avoid reinforcing errors caused by label noise, we propose an uncertainty-aware mean teacher framework which implicitly filters incorrect pseudo-labels during training. ...
- Initialize student model with φ_s
- for 1 ≤ epoch ≤ num_epochs do
  - Train student net with target data annotated with pseudo-labels {Y^pt_{i,J}}_{i=1}^{M}
To ensure a fair comparison across all ...
arXiv:2109.14651v1
fatcat:5wo4dotrb5ejtnqcgqneagnkyi
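In mean-teacher frameworks like the one above, the teacher's weights typically track an exponential moving average (EMA) of the student's weights rather than being trained by gradient descent. A minimal sketch of that update, with a list-of-floats parameterization standing in for real network tensors:

```python
def ema_update(teacher_weights, student_weights, alpha=0.99):
    # Teacher weights track an exponential moving average of the
    # student weights: theta_t <- alpha * theta_t + (1 - alpha) * theta_s.
    return [alpha * t + (1.0 - alpha) * s
            for t, s in zip(teacher_weights, student_weights)]

teacher = [0.0, 1.0]
student = [1.0, 1.0]
teacher = ema_update(teacher, student, alpha=0.9)
```

A large alpha makes the teacher a slowly-moving, temporally smoothed copy of the student, which stabilizes the pseudo-label targets.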
Inconsistency-aware Uncertainty Estimation for Semi-supervised Medical Image Segmentation
[article]
2021
arXiv
pre-print
In this paper, we investigate a novel method of estimating uncertainty. ...
Therefore, we present a new semi-supervised segmentation model, namely, conservative-radical network (CoraNet in short) based on our uncertainty estimation and separate self-training strategy. ...
In this paper, we proposed a novel method of estimating uncertainty by capturing the inconsistent prediction between multiple cost-sensitive settings. ...
arXiv:2110.08762v1
fatcat:adu3h2pqgba4fh2efryh6tofqi
Leveraging Labeling Representations in Uncertainty-based Semi-supervised Segmentation
[article]
2022
arXiv
pre-print
The predictions of unlabeled data are not reliable; therefore, uncertainty-aware methods have been proposed to gradually learn from meaningful and reliable predictions. ...
A prominent way to utilize the unlabeled data is by consistency training which commonly uses a teacher-student network, where a teacher guides a student segmentation. ...
Acknowledgments: This research work was partly funded by the Canada Research Chair on Shape Analysis in Medical Imaging, the Natural Sciences and Engineering Research Council of Canada (NSERC), and the ...
arXiv:2203.05682v1
fatcat:fcfcg7salvfx3b64shxwu7e4di
Uncertainty-Guided Mutual Consistency Learning for Semi-Supervised Medical Image Segmentation
[article]
2021
arXiv
pre-print
Medical image segmentation is a fundamental and critical step in many clinical approaches. ...
In this paper, we propose a novel uncertainty-guided mutual consistency learning framework to effectively exploit unlabeled data by integrating intra-task consistency learning from up-to-date predictions ...
... fair comparison. ...
arXiv:2112.02508v1
fatcat:ofgv42dygvhyxphgh2wbcgdvoy
Predicting Math Student Success in the Initial Phase of College With Sparse Information Using Approaches From Statistical Learning
2020
Frontiers in Education
We investigate the completion of a first semester course as a dropout indicator and thereby provide not only good predictions, but also generate interpretable and practicable results together with easy-to-understand ...
In math teacher education, dropout research relies mostly on frameworks which carry out extensive variable collections leading to a lack of practical applicability. ...
This means the subject of research in this paper is not dropout from the university or the study program, but dropout and success in this lecture, in the sense of a non-completion rate. ...
doi:10.3389/feduc.2020.502698
fatcat:4frsjsgli5e3pjsbzo7kpook64
A Review of Uncertainty Quantification in Deep Learning: Techniques, Applications and Challenges
[article]
2021
arXiv
pre-print
Uncertainty quantification (UQ) plays a pivotal role in reducing uncertainty during both optimization and decision-making processes. ...
It can be applied to solve a variety of real-world applications in science and engineering. ...
A schematic comparison of the three different uncertainty models [9] (MC dropout, Bootstrap model, and GMM) is provided in Fig. 2. ...
arXiv:2011.06225v4
fatcat:wwnl7duqwbcqbavat225jkns5u
Adversarial Distillation of Bayesian Neural Network Posteriors
[article]
2018
arXiv
pre-print
Bayesian neural networks (BNNs) allow us to reason about uncertainty in a principled way. ...
However, SGLD and its extensions require storage of many copies of the model parameters, a potentially prohibitive cost, especially for large neural networks. ...
We did not use momentum, for a fair comparison with vanilla SGLD, which also did not use momentum. ...
arXiv:1806.10317v1
fatcat:lzsdxuaxbffjxao3jzzpus4htm
Showing results 1 — 15 out of 2,647 results