A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL. The file type is application/pdf.
Exploring the Uncertainty Properties of Neural Networks' Implicit Priors in the Infinite-Width Limit
[article]
2020
arXiv
pre-print
Previous benchmarks have found that ensembles of neural networks (NNs) are typically the best calibrated models on OOD data. ...
We use the NNGP with a softmax link function to build a probabilistic model for multi-class classification and marginalize over the latent Gaussian outputs to sample from the posterior. ...
In this work, we focus on FC and CNN-Vec NNGPs, whose kernels are derived from fully-connected networks and convolutional networks without pooling, respectively (see Sec A for precise definitions). ...
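The snippet above describes marginalizing latent Gaussian outputs through a softmax link to obtain posterior predictive class probabilities. A minimal Monte Carlo sketch of that idea, assuming a single test point with a given latent mean and covariance over classes (function names and shapes are illustrative, not from the paper):

```python
import numpy as np

def sample_softmax_posterior(mean, cov, n_samples=1000, seed=0):
    """Sketch: draw latent function values per class from a Gaussian,
    push each draw through a softmax link, and average the draws to
    approximate the posterior predictive class probabilities.
    `mean` has shape (classes,); `cov` is the (classes x classes)
    latent covariance for one test input (an assumed setup)."""
    rng = np.random.default_rng(seed)
    latents = rng.multivariate_normal(mean, cov, size=n_samples)
    z = latents - latents.max(axis=1, keepdims=True)  # numerically stable softmax
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    return probs.mean(axis=0)
```

With a strongly separated latent mean and small covariance, the averaged predictive distribution concentrates on the dominant class, as expected.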
arXiv:2010.07355v1
fatcat:brwoxrmczff3pi2zqpb7nt5qz4
Universal uncertainty estimation for nuclear detector signals with neural networks and ensemble learning
[article]
2022
arXiv
pre-print
In this paper, we propose using multi-layer convolutional neural networks for empirical uncertainty estimation and feature extraction of nuclear pulse signals. ...
Furthermore, ensemble learning is utilized to estimate the uncertainty originating from the trainable parameters of the network and to improve the robustness of the whole model. ...
the time series of the signal, 𝒚 is the ground-truth label indicating the desired output, 𝜽 denotes the trainable parameters of the neural network, and 𝝁, 𝝈² are the predictive mean and predictive variance, respectively ...
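The ensemble-based estimate of 𝝁 and 𝝈² described above can be sketched as follows; this is a generic deep-ensemble aggregation, assuming each member's predictions are already computed (training is out of scope here):

```python
import numpy as np

def ensemble_mean_variance(member_predictions):
    """Given per-member predictions (shape: members x samples),
    return the predictive mean and the variance across members.
    The spread across members reflects uncertainty from the
    trainable parameters; a minimal sketch, not the paper's code."""
    preds = np.asarray(member_predictions, dtype=float)
    mu = preds.mean(axis=0)      # predictive mean over the ensemble
    sigma2 = preds.var(axis=0)   # disagreement between members
    return mu, sigma2
```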
arXiv:2110.04975v3
fatcat:djvnpmf4rjg4bopbeejphzpjbm
Training BatchNorm Only in Neural Architecture Search and Beyond
[article]
2021
arXiv
pre-print
To this end, we propose a novel composite performance indicator to evaluate networks from three perspectives: expressivity, trainability, and uncertainty, derived from the theoretical property of BatchNorm ...
We begin by showing that the train-BN-only networks converge to the neural tangent kernel regime, theoretically obtaining the same training dynamics as training all parameters. ...
BN-only is a neural tangent kernel. ...
arXiv:2112.00265v1
fatcat:eckj67fp7redhoho4shyk6wnc4
Improved Trainable Calibration Method for Neural Networks on Medical Imaging Classification
[article]
2020
arXiv
pre-print
Empirically, neural networks are often miscalibrated and overconfident in their predictions. ...
The proposed approach is based on expected calibration error, which is a common metric for quantifying miscalibration. ...
$\lvert \hat{p} - p \rvert$. (2)
Measurements: Expected Calibration Error (ECE) is a commonly used criterion for measuring neural network calibration error. ...
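ECE as described above bins predictions by confidence and averages the per-bin gap between accuracy and mean confidence, weighted by bin occupancy. A minimal sketch (bin count and interval conventions are assumptions, not taken from the paper):

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=15):
    """Sketch of ECE: partition [0, 1] into equal-width confidence
    bins, then accumulate |accuracy - mean confidence| per bin,
    weighted by the fraction of samples falling in that bin."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    n = len(confidences)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(correct[mask].mean() - confidences[mask].mean())
            ece += mask.sum() / n * gap
    return ece
```

A perfectly calibrated, perfectly confident classifier yields an ECE of zero; a confident but wrong prediction contributes its full confidence gap.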
arXiv:2009.04057v1
fatcat:oved252a6bhbpiofayz6e5cpmm
The IDLAB VoxSRC-20 Submission: Large Margin Fine-Tuning and Quality-Aware Score Calibration in DNN Based Speaker Verification
[article]
2021
arXiv
pre-print
It allows the network to create more robust speaker embeddings by enabling the use of longer training utterances in combination with a more aggressive margin penalty. ...
Large margin fine-tuning is a secondary training stage for DNN based speaker verification systems trained with margin-based loss functions. ...
We argue neural network based systems can benefit from quality measurements in the calibration step as well. ...
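Quality-aware score calibration of the kind argued for above can be sketched as a logistic regression that maps a raw verification score together with a quality measure (e.g. utterance duration) to a calibrated probability. Everything below is a hypothetical illustration in plain NumPy, not the IDLAB implementation:

```python
import numpy as np

def calibrate_scores(scores, quality, labels, lr=0.1, steps=2000):
    """Hypothetical quality-aware calibration sketch: fit a logistic
    regression on [1, score, quality] by gradient descent so that
    raw scores map to calibrated target/non-target probabilities."""
    X = np.column_stack([np.ones_like(scores), scores, quality])
    y = np.asarray(labels, dtype=float)
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))   # current calibrated probabilities
        w -= lr * X.T @ (p - y) / len(y)   # gradient of the log loss
    return w  # apply via sigmoid(w0 + w1*score + w2*quality)
```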
arXiv:2010.11255v2
fatcat:3n5jnj7c7ra23lgjdgs4h55aoq
The Idlab Voxsrc-20 Submission: Large Margin Fine-Tuning and Quality-Aware Score Calibration in DNN Based Speaker Verification
2021
ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
It allows the network to create more robust speaker embeddings by enabling the use of longer training utterances in combination with a more aggressive margin penalty. ...
Large margin fine-tuning is a secondary training stage for DNN based speaker verification systems trained with margin-based loss functions. ...
We argue neural network based systems can benefit from quality measurements in the calibration step as well. ...
doi:10.1109/icassp39728.2021.9414600
fatcat:zil3jqjohnh7zojziimsujdqy4
O2D2: Out-Of-Distribution Detector to Capture Undecidable Trials in Authorship Verification
[article]
2021
arXiv
pre-print
Our system is based on our 2020 winning submission, with updates to significantly reduce sensitivities to topical variations and to further improve the system's calibration by means of an uncertainty-adaptation ...
In this work, we present a novel hybrid neural-probabilistic framework that is designed to tackle the challenges of the 2021 task. ...
Project funding was provided by the state of North Rhine-Westphalia within the Research Training Group "SecHuman -Security for Humans in Cyberspace" and by the Deutsche Forschungsgemeinschaft (DFG) under ...
arXiv:2106.15825v3
fatcat:doplzuzw3jg3hmv2nhx6ps4l6m
Unrolled Primal-Dual Networks for Lensless Cameras
[article]
2022
arXiv
pre-print
Conventional image reconstruction models for lensless cameras often assume that each measurement results from convolving a given scene with a single experimentally measured point-spread function. ...
This improvement stems from our primary finding that embedding learnable forward and adjoint models in a learned primal-dual optimization framework can even improve the quality of reconstructed images ...
ACKNOWLEDGEMENT We thank Laura Waller, Kristina Monakhova, Tianjiao Zeng and Edmund Lam for their support in providing useful insights from their work; Tobias Ritschel for fruitful discussions at the early ...
arXiv:2203.04353v1
fatcat:hxuxy5p3mncsxhg4vbao7frsvq
Model Architecture Adaption for Bayesian Neural Networks
[article]
2022
arXiv
pre-print
Bayesian Neural Networks (BNNs) offer a mathematically grounded framework to quantify the uncertainty of model predictions but come with a prohibitive computation cost for both training and inference. ...
Different from canonical NAS that optimizes solely for in-distribution likelihood, the proposed scheme searches for the uncertainty performance using both in- and out-of-distribution data. ...
for the kernel size). ...
arXiv:2202.04392v1
fatcat:ejv2fuwrubdodfysvgv367ya3m
Uncertainty-Aware Time-to-Event Prediction using Deep Kernel Accelerated Failure Time Models
[article]
2021
arXiv
pre-print
We propose Deep Kernel Accelerated Failure Time models for the time-to-event prediction task, enabling uncertainty-awareness of the prediction by a pipeline of a recurrent neural network and a sparse Gaussian ...
Recurrent neural network based solutions are increasingly being used in the analysis of longitudinal Electronic Health Record data. ...
Acknowledgement The authors acknowledge the support by the German Federal Ministry for Education and Research (BMBF), funding project "MLWin" (grant 01IS18050). ...
arXiv:2107.12250v1
fatcat:t2bms2olijc2bfbxhwt3qcazyq
PYRO-NN: Python Reconstruction Operators in Neural Networks
[article]
2019
arXiv
pre-print
An increasing number of publications follow the concept of embedding the CT reconstruction as a known operator into a neural network. ...
Conclusions: PYRO-NN comes with the prevalent deep learning framework Tensorflow and allows setting up end-to-end trainable neural networks in the medical image reconstruction context. ...
Additional financial support for this project was granted by the Emerging Fields Initiative (EFI) of the Friedrich-Alexander University Erlangen-Nürnberg(FAU). ...
arXiv:1904.13342v1
fatcat:wz3ly6gvpvbmvih76xdu5kmcyi
Artificial Intelligence for Mass Spectrometry and Nuclear Magnetic Resonance Spectroscopy Using a Novel Data Augmentation Method
2021
IEEE Transactions on Emerging Topics in Computing
This paper presents recent advances from two projects that use Artificial Neural Networks (ANNs) to address the challenges of automation and performance-efficient realizations of MS and NMR. ...
These processes are traditionally carried out manually and by a specialist, which takes a substantial amount of time and prevents their utilization for real-time closed-loop process control. ...
The authors would like to thank their colleagues Lukas Wander and Martin Bornemann-Pfeiffer for their support in conducting the experiments and fruitful discussions. ...
doi:10.1109/tetc.2021.3131371
fatcat:t3ofmg6zynbf5loi433be357mq
Graph neural networks for laminar flow prediction around random 2D shapes
[article]
2021
arXiv
pre-print
In the recent years, the domain of fast flow field prediction has been vastly dominated by pixel-based convolutional neural networks. ...
Yet, the recent advent of graph convolutional neural networks (GCNNs) has attracted considerable attention in the computational fluid dynamics (CFD) community. ...
from that of the neural networks (NN) domain. ...
arXiv:2107.11529v2
fatcat:xtukkbktz5hlrjcklzbcbuwxku
Integrating Frequency Translational Invariance in TDNNs and Frequency Positional Information in 2D ResNets to Enhance Speaker Verification
[article]
2021
arXiv
pre-print
Currently, both Time Delay Neural Networks (TDNNs) and ResNets achieve state-of-the-art results in speaker verification. ...
This paper describes the IDLab submission for the text-independent task of the Short-duration Speaker Verification Challenge 2021 (SdSVC-21). ...
The converged network is subsequently used to extract low-dimensional speaker embeddings from a bottleneck layer in the final part of the network. ...
arXiv:2104.02370v1
fatcat:7wjdagibrjev7ib6fmzc4zdu7a
Learning from irregularly sampled data for endomicroscopy super-resolution: a comparative study of sparse and dense approaches
2020
International Journal of Computer Assisted Radiology and Surgery
We also implement trainable generalised NW kernel regression as a novel sparse approach, and generate synthetic data for training pCLE SR. ...
It was shown that convolutional neural networks (CNNs) could improve pCLE image quality. Yet classical CNNs may be suboptimal in regard to irregular data. ...
This work was undertaken at UCL and UCLH, which receive a proportion of funding from the DoH NIHR UCLH BRC funding scheme. ...
doi:10.1007/s11548-020-02170-7
pmid:32415459
pmcid:PMC7316691
fatcat:5gqre4oyuvcixmqeaezay575nm
Showing results 1 — 15 out of 759 results