13,192 Hits in 4.6 sec

Post-hoc Calibration of Neural Networks by g-Layers [article]

Amir Rahimi, Thomas Mensink, Kartik Gupta, Thalaiyasingam Ajanthan, Cristian Sminchisescu, Richard Hartley
2022 arXiv   pre-print
In recent years, there has been a surge of research on neural network calibration, and the majority of the work can be categorized as post-hoc calibration methods, defined as methods that learn an additional  ...  This not only provides a less stringent condition for obtaining a calibrated network but also provides a theoretical justification of post-hoc calibration methods.  ...  during training, after early stopping, or through post-hoc calibration.  ... 
arXiv:2006.12807v2 fatcat:6chv3r3fw5dpda7lcepzzjy3qi
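Many of the entries in this listing evaluate post-hoc calibration with the expected calibration error (ECE). As a reference point, here is a minimal NumPy sketch of the standard equal-width-binning ECE estimator; the function name and binning scheme are illustrative defaults, not taken from any of the listed papers:

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """ECE: weighted average gap between mean confidence and accuracy
    over equal-width confidence bins. `confidences` are max softmax
    probabilities; `correct` is a 0/1 array of prediction correctness."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(confidences[mask].mean() - correct[mask].mean())
            ece += mask.mean() * gap  # weight bin by fraction of samples
    return ece
```

An overconfident model (high confidence, lower accuracy) yields a large ECE; a well-calibrated one drives it toward zero.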

Parameterized Temperature Scaling for Boosting the Expressive Power in Post-Hoc Uncertainty Calibration [article]

Christian Tomani, Daniel Cremers, Florian Buettner
2021 arXiv   pre-print
Standard deep neural networks typically yield uncalibrated predictions, which can be transformed into calibrated confidence scores using post-hoc calibration methods.  ...  In this contribution, we demonstrate that the performance of accuracy-preserving state-of-the-art post-hoc calibrators is limited by their intrinsic expressive power.  ...  Since standard neural networks tend to yield systematically overconfident predictions [5] , a number of algorithms for post-hoc calibration have been proposed.  ... 
arXiv:2102.12182v1 fatcat:h7oclq2j2fdajcmjf6ndver6ee
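Parameterized Temperature Scaling extends plain temperature scaling, which divides the logits by a single scalar T fitted on held-out data before the softmax. A minimal sketch of the plain method, with a grid search standing in for the usual LBFGS fit (the grid range and function names are assumptions, not the paper's implementation):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def nll(logits, labels, T):
    """Negative log-likelihood under temperature-scaled softmax."""
    p = softmax(logits / T)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

def fit_temperature(logits, labels, grid=np.linspace(0.5, 5.0, 91)):
    """Pick the scalar T minimizing validation NLL. T > 1 softens
    overconfident predictions; accuracy is unchanged because argmax
    is preserved."""
    return min(grid, key=lambda T: nll(logits, labels, T))
```

Because a single scalar cannot reshape the confidence distribution per input, its expressive power is limited, which is the gap this paper targets.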

Post-hoc Uncertainty Calibration for Domain Drift Scenarios [article]

Christian Tomani, Sebastian Gruber, Muhammed Ebrar Erdem, Daniel Cremers, Florian Buettner
2021 arXiv   pre-print
While standard deep neural networks typically yield uncalibrated predictions, calibrated confidence scores that are representative of the true likelihood of a prediction can be achieved using post-hoc  ...  First, we show that existing post-hoc calibration methods yield highly over-confident predictions under domain shift.  ...  Since deep neural networks typically only yield uncalibrated confidence scores, a variety of different post-hoc calibration approaches have been proposed [15, 4, 23, 22, 24] .  ... 
arXiv:2012.10988v2 fatcat:ngm2tccxqna33g4acyedur6d54

Short-Term Solar Irradiance Forecasting Using Calibrated Probabilistic Models [article]

Eric Zelikman, Sharon Zhou, Jeremy Irvin, Cooper Raterink, Hao Sheng, Anand Avati, Jack Kelly, Ram Rajagopal, Andrew Y. Ng, David Gagne
2020 arXiv   pre-print
We investigate the use of post-hoc calibration techniques for ensuring well-calibrated probabilistic predictions.  ...  Further, we show that NGBoost with CRUDE post-hoc calibration achieves comparable performance to a numerical weather prediction model on hourly-resolution forecasting.  ...  Generally, with post-hoc calibration, the variational neural network and NGBoost were sharper than the Gaussian process and dropout-uncertainty-based neural network.  ... 
arXiv:2010.04715v2 fatcat:liqhgwehirfkbhamycsbu7rfya

Rethinking Calibration of Deep Neural Networks: Do Not Be Afraid of Overconfidence

Deng-Bao Wang, Lei Feng, Min-Ling Zhang
2021 Neural Information Processing Systems  
the room for potential improvement in the post-hoc calibration phase.  ...  However, modern neural networks have been found to be poorly calibrated, primarily in the direction of overconfidence.  ...  Plan Guided by the Ministry of Education.  ... 
dblp:conf/nips/WangFZ21 fatcat:6nn34o4anraypbeoq3u4en4rtq

Soft Calibration Objectives for Neural Networks [article]

Archit Karandikar, Nicholas Cain, Dustin Tran, Balaji Lakshminarayanan, Jonathon Shlens, Michael C. Mozer, Becca Roelofs
2021 arXiv   pre-print
However, deep neural networks are often under- or over-confident in their predictions.  ...  Consequently, methods have been developed to improve the calibration of their predictive uncertainty both during training and post-hoc.  ...  Acknowledgements The authors thank Brennan McConnell and Mohammad Khajah who conducted initial explorations of soft binning calibration loss.  ... 
arXiv:2108.00106v2 fatcat:oxgu3qitdzb55b7lehp5ces5ii

Post Training Uncertainty Calibration of Deep Networks For Medical Image Segmentation [article]

Axel-Jan Rousseau, Thijs Becker, Jeroen Bertels, Matthew B. Blaschko, Dirk Valkenborg
2020 arXiv   pre-print
Neural networks for automated image segmentation are typically trained to achieve maximum accuracy, while less attention has been given to the calibration of their confidence scores.  ...  In all cases, post hoc calibration is competitive with MC dropout. Although average calibration improves compared to the base model, subject-level variance of the calibration remains similar.  ...  They evaluated the performance of various post hoc calibration methods on NN classifiers and found an extension of Platt scaling to be surprisingly effective.  ... 
arXiv:2010.14290v1 fatcat:bmlroqbv2ve2rhwx6ex7bsffmm

Post Training Uncertainty Calibration Of Deep Networks For Medical Image Segmentation

Axel-Jan Rousseau, Thijs Becker, Jeroen Bertels, Matthew B. Blaschko, Dirk Valkenborg
2021 2021 IEEE 18th International Symposium on Biomedical Imaging (ISBI)  
Neural networks for automated image segmentation are typically trained to achieve maximum accuracy, while less attention has been given to the calibration of their confidence scores.  ...  In all cases, post hoc calibration is competitive with MC dropout. Although average calibration improves compared to the base model, subject-level variance of the calibration remains similar.  ...  They evaluated the performance of various post hoc calibration methods on NN classifiers and found an extension of Platt scaling to be surprisingly effective.  ... 
doi:10.1109/isbi48211.2021.9434131 fatcat:x3ueqvyj6bhkjf6iexr3oikbnm
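Both versions of this paper report that an extension of Platt scaling is surprisingly effective. For reference, classic binary Platt scaling fits a sigmoid sigmoid(a·s + b) over the classifier score s on a validation set. A minimal gradient-descent sketch (the learning rate, step count, and function names are illustrative assumptions, not the authors' setup):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_platt(scores, labels, lr=0.1, steps=2000):
    """Platt scaling: fit p = sigmoid(a*s + b) to binary labels by
    gradient descent on the logistic loss. Only two parameters are
    learned, so the classifier's ranking (and hence accuracy) is
    preserved whenever a > 0."""
    s = np.asarray(scores, dtype=float)
    y = np.asarray(labels, dtype=float)
    a, b = 1.0, 0.0
    for _ in range(steps):
        p = sigmoid(a * s + b)
        g = p - y                  # d(logistic loss)/d(logit)
        a -= lr * np.mean(g * s)
        b -= lr * np.mean(g)
    return a, b
```

The original recipe also smooths the targets slightly to avoid overfitting on small validation sets; that detail is omitted here for brevity.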

Calibrating Deep Neural Network Classifiers on Out-of-Distribution Datasets [article]

Zhihui Shao, Jianyi Yang, Shaolei Ren
2020 arXiv   pre-print
To increase the trustworthiness of deep neural network (DNN) classifiers, an accurate prediction confidence that represents the true likelihood of correctness is crucial.  ...  Towards this end, many post-hoc calibration methods have been proposed to leverage a lightweight model to map the target DNN's output layer into a calibrated confidence.  ...  We propose a new post-hoc confidence calibration method, called CCAC (Confidence Calibration with an Auxiliary Class), which, building on top of a lightweight (e.g., 2-layer) neural network, only needs  ... 
arXiv:2006.08914v1 fatcat:4wkxrr74wzdgla3sizw46crypm

Intra Order-preserving Functions for Calibration of Multi-Class Neural Networks [article]

Amir Rahimi, Amirreza Shaban, Ching-An Cheng, Richard Hartley, Byron Boots
2020 arXiv   pre-print
A common approach is to learn a post-hoc calibration function that transforms the output of the original network into calibrated confidence scores while maintaining the network's accuracy.  ...  However, previous post-hoc calibration techniques work only with simple calibration functions, potentially lacking sufficient representation to calibrate the complex function landscape of deep networks  ...  In short, calibrating neural networks with this new family of functions generalizes many existing calibration techniques, with additional flexibility to express the post-hoc calibration function.  ... 
arXiv:2003.06820v2 fatcat:urc6jgusafb77ecv36johiowka

Towards reliable and fair probabilistic predictions: field-aware calibration with neural networks [article]

Feiyang Pan, Xiang Ao, Pingzhong Tang, Min Lu, Dapeng Liu, Qing He
2019 arXiv   pre-print
We propose Neural Calibration, a new calibration method, which learns to calibrate by making full use of all input information over the validation set.  ...  The results show that Neural Calibration significantly improves over uncalibrated predictions on all well-known metrics, such as the negative log-likelihood, the Brier score, and the AUC score, as well as  ...  traditional post-hoc calibration methods is that they are not robust in the face of data shift.  ... 
arXiv:1905.10713v2 fatcat:ujgm326ovvemjjmugd6molszv4

Investigation of Uncertainty of Deep Learning-based Object Classification on Radar Spectra [article]

Kanil Patel, William Beluch, Kilian Rambach, Adriana-Eliza Cozma, Michael Pfeiffer, Bin Yang
2021 arXiv   pre-print
We show that by applying state-of-the-art post-hoc uncertainty calibration, the quality of confidence measures can be significantly improved, thereby partially resolving the over-confidence problem.  ...  the predictions; however, decisions of DL networks are non-transparent.  ...  Two such methods are used in this paper to demonstrate the effectiveness of post-hoc methods to improve network calibration.  ... 
arXiv:2106.05870v1 fatcat:wicivzdfhrarpf7a3okninems4

Improving Uncertainty Calibration of Deep Neural Networks via Truth Discovery and Geometric Optimization [article]

Chunwei Ma, Ziyun Huang, Jiayi Xian, Mingchen Gao, Jinhui Xu
2022 arXiv   pre-print
Ensemble techniques and post-hoc calibrations are two types of approaches that have individually shown promise in improving the uncertainty calibration of DNNs.  ...  Furthermore, we show that post-hoc calibration can also be enhanced by truth discovery-regularized optimization.  ...  Post-hoc Calibration of Deep Neural Networks.  ... 
arXiv:2106.14662v3 fatcat:gbhsagjksbbrjlokw2owhzmjki

Intra-Processing Methods for Debiasing Neural Networks [article]

Yash Savani, Colin White, Naveen Sundar Govindarajulu
2020 arXiv   pre-print
In this work, we initiate the study of a new paradigm in debiasing research, intra-processing, which sits between in-processing and post-processing methods.  ...  Pre- or in-processing methods would require retraining the entire model from scratch, while post-processing methods only have black-box access to the model, so they do not leverage the weights of the trained  ...  In this work, we present a formal study of post-hoc methods for debiasing neural networks.  ... 
arXiv:2006.08564v2 fatcat:pauvm5izljcmldstpfhlbb3gda

Probabilistic Models for Manufacturing Lead Times [article]

Recep Yusuf Bekci, Yacine Mahdid, Jinling Xing, Nikita Letov, Ying Zhang, Zahid Pasha
2022 arXiv   pre-print
In this study, we utilize Gaussian processes, probabilistic neural networks, natural gradient boosting, and quantile regression augmented gradient boosting to model lead times of laser manufacturing processes  ...  Our results indicate that all of the models beat the company estimation benchmark that uses domain experience and have good calibration with the empirical frequencies.  ...  The dashed line represents perfect calibration. We give the results without post-hoc calibration for brevity; post-hoc calibration results can be found in the Appendix.  ... 
arXiv:2204.13792v2 fatcat:u2p3m7sr6vbunegwycohhmxls4
Showing results 1 — 15 out of 13,192 results