2,448 Hits in 7.1 sec

Calibrated Multiple-Output Quantile Regression with Representation Learning [article]

Shai Feldman, Stephen Bates, Yaniv Romano
2021 arXiv   pre-print
Existing multiple-output quantile regression approaches are effective in such cases, so we apply them on the learned representation, and then transform the solution to the original space of the response  ...  First, we use a deep generative model to learn a representation of the response that has a unimodal distribution.  ...  The procedure we propose is generic and can be applied to any multiple-output quantile regression method, including those we discuss in this work.  ... 
arXiv:2110.00816v1 fatcat:mcfxrg4fdnhv3ejbluswaqzlha

A framework for probabilistic weather forecast post-processing across models and lead times using machine learning [article]

Charlie Kirkwood, Theo Economou, Henry Odbert, Nicolas Pugeault
2020 arXiv   pre-print
well-calibrated probabilistic output.  ...  First, we use Quantile Regression Forests to learn the error profile of each numerical model, and use these to apply empirically-derived probability distributions to forecasts.  ...  The lead author is grateful for the insightful discussions and community feedback that came from attending the Machine Learning for Weather and Climate Modelling conference at the University of Oxford  ... 
arXiv:2005.06613v2 fatcat:aae3pkztobcqjgbi3v6lbt6fsq

Beyond expectation: Deep joint mean and quantile regression for spatio-temporal problems [article]

Filipe Rodrigues, Francisco C. Pereira
2018 arXiv   pre-print
In this paper, we propose a multi-output multi-quantile deep learning approach for jointly modeling several conditional quantiles together with the conditional expectation as a way to provide a more complete ... Using two large-scale datasets from the transportation domain, we empirically demonstrate that, by approaching the quantile regression problem from a multi-task learning perspective, it is possible to ... In our proposed approach, this is done by having a common latent representation learned by the ConvLSTM layers for the multiple tasks, and by using hard-parameter sharing in the proposed output layer ...
arXiv:1808.08798v1 fatcat:k6js3cry4rcfxjhmqiolmeikkm

Distribution Calibration for Regression [article]

Hao Song, Tom Diethe, Meelis Kull, Peter Flach
2019 arXiv   pre-print
We are concerned with obtaining well-calibrated output distributions from regression models. ... We further propose a post-hoc approach to improving the predictions from previously trained regression models, using multi-output Gaussian Processes with a novel Beta link function. ... Beta link function for regression: We first adopt the parametric Beta calibration map family [16, 17] as a tool to calibrate the CDF of any regression output, by transforming quantiles with a beta calibration ...
arXiv:1905.06023v1 fatcat:u3kqvmyinngf5dpbosqp3f7y3y
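
The beta calibration map family this snippet adopts is a known three-parameter monotone map on (0, 1); a minimal numpy sketch of the map itself (the paper's use of it inside multi-output Gaussian Processes is more involved, and the demo values below are illustrative assumptions):

```python
import numpy as np

# Beta calibration map: monotone on (0, 1) for a, b > 0;
# a = b = 1, c = 0 recovers the identity map.
def beta_map(p, a, b, c):
    return 1.0 / (1.0 + 1.0 / (np.exp(c) * p**a / (1.0 - p)**b))

p = np.linspace(0.01, 0.99, 99)
identity = beta_map(p, 1.0, 1.0, 0.0)    # equals p
sharpened = beta_map(p, 2.0, 2.0, 0.0)   # pushes CDF values toward 0 and 1
```

Applied to the predicted CDF values of a regression model, such a map can correct systematic over- or under-dispersion while preserving the ordering of quantiles.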

Calibrated and Sharp Uncertainties in Deep Learning via Simple Density Estimation [article]

Volodymyr Kuleshov, Shachi Deshpande
2021 arXiv   pre-print
Our methods focus on the strongest notion of calibration (distribution calibration) and enforce it by fitting a low-dimensional density or quantile function with a neural estimator. ... The resulting approach is much simpler and more broadly applicable than previous methods across both classification and regression. ... technique based on quantile function regression with a neural network estimator. ...
arXiv:2112.07184v1 fatcat:qaw6we4azjcgvg26acudtyazre

Image-to-Image Regression with Distribution-Free Uncertainty Quantification and Applications in Imaging [article]

Anastasios N Angelopoulos, Amit P Kohli, Stephen Bates, Michael I Jordan, Jitendra Malik, Thayer Alshaabi, Srigokul Upadhyayula, Yaniv Romano
2022 arXiv   pre-print
Image-to-image regression is an important learning task, used frequently in biological imaging. ... Our methods work in conjunction with any base machine learning model, such as a neural network, and endow it with formal mathematical guarantees, regardless of the true unknown data distribution or choice ... For each example, we show the input, output, and the prediction sets generated by quantile regression along with the ground truth target. ...
arXiv:2202.05265v1 fatcat:2s6z76jnwfhhrnndkyivlxwqji

Few-shot Conformal Prediction with Auxiliary Tasks [article]

Adam Fisch, Tal Schuster, Tommi Jaakkola, Regina Barzilay
2021 arXiv   pre-print
Conformal prediction identifies a small set of promising output candidates in place of a single prediction, with guarantees that the set contains the correct answer with high probability. ... Our conformalization algorithm is simple, fast, and agnostic to the choice of underlying model, learning algorithm, or dataset. ... Finally, we map the output representation to a hidden size of 16, and apply least-squares regression. ...
arXiv:2102.08898v2 fatcat:oektkhgbbzhejpms6vt2eympqq
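
Split conformal prediction, the generic recipe this line of work builds on, can be sketched independently of the few-shot setting; the toy base model and synthetic data below are illustrative assumptions, not the paper's method:

```python
import math
import numpy as np

rng = np.random.default_rng(0)

# Illustrative base model: yhat(x) = x (any fitted model works here).
def model(x):
    return x

# Held-out calibration set with absolute-residual nonconformity scores.
x_cal = rng.uniform(-2, 2, 500)
y_cal = x_cal + rng.normal(0, 0.3, 500)
scores = np.abs(y_cal - model(x_cal))

alpha = 0.1
n = len(scores)
k = math.ceil((n + 1) * (1 - alpha))   # finite-sample corrected rank
qhat = np.sort(scores)[k - 1]          # conformal quantile of the scores

# Prediction interval model(x) +/- qhat contains y with prob >= 1 - alpha.
x_test = rng.uniform(-2, 2, 2000)
y_test = x_test + rng.normal(0, 0.3, 2000)
covered = np.mean(np.abs(y_test - model(x_test)) <= qhat)
```

The coverage guarantee is distribution-free: it relies only on exchangeability of the calibration and test points, not on the model being correct.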

Autoregressive Quantile Flows for Predictive Uncertainty Estimation [article]

Phillip Si, Allan Bishop, Volodymyr Kuleshov
2021 arXiv   pre-print
Numerous applications of machine learning involve predicting flexible probability distributions over model outputs. ... We propose Autoregressive Quantile Flows, a flexible class of probabilistic models over high-dimensional variables that can be used to accurately capture predictive aleatoric uncertainties. ... In quantile function regression, a baseline model H outputs a quantile function Q(α); in practice, we implement this model via a neural network f(x, α) that takes as input an extra α ∈ [0, 1] and outputs ...
arXiv:2112.04643v1 fatcat:bkshkseiijgh7hkqwhy7p5ehhi

Quantile Surfaces – Generalizing Quantile Regression to Multivariate Targets [article]

Maarten Bieshaar, Jens Schreiber, Stephan Vogt, André Gensler, Bernhard Sick
2020 arXiv   pre-print
Our approach is based on an extension of single-output quantile regression (QR) to multivariate targets, called quantile surfaces (QS). ... Subsequently, we model the prediction uncertainty using QS involving neural networks called quantile surface regression neural networks (QSNN). ... Spatial relation is modeled by the multivariate output of a regression model, i.e., we aim to forecast the power generation of multiple farms simultaneously. ...
arXiv:2010.05898v1 fatcat:hbtu4z725feuxldsqeije6wxtq

Single-Model Uncertainties for Deep Learning [article]

Natasa Tagasovska, David Lopez-Paz
2019 arXiv   pre-print
To estimate aleatoric uncertainty, we propose Simultaneous Quantile Regression (SQR), a loss function to learn all the conditional quantiles of a given target variable.  ...  These quantiles can be used to compute well-calibrated prediction intervals.  ...  Introduction Deep learning permeates our lives, with prospects to drive our cars and decide on our medical treatments.  ... 
arXiv:1811.00908v3 fatcat:5kx3elc2j5hhrhhafbsd3ree3m
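
The pinball (quantile) loss underlying SQR has a defining property: over a constant predictor, its minimizer is the empirical τ-quantile of the data. A minimal numpy check of that property on synthetic Gaussian data (a sketch of the loss only, not of the SQR network):

```python
import numpy as np

# Pinball loss: rho_tau(u) = u * (tau - 1[u < 0]).
def pinball(y, q, tau):
    u = y - q
    return np.mean(np.maximum(tau * u, (tau - 1.0) * u))

rng = np.random.default_rng(0)
y = rng.normal(size=2001)

# Grid-search the constant q minimizing the average pinball loss at tau = 0.9:
tau = 0.9
grid = np.linspace(-3.0, 3.0, 1201)
losses = [pinball(y, q, tau) for q in grid]
best = grid[np.argmin(losses)]   # close to the empirical 90th percentile of y
```

SQR exploits this by feeding a randomly sampled τ into the network alongside x and minimizing the same loss, so a single model learns all conditional quantiles simultaneously.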

Statistical Postprocessing for Weather Forecasts – Review, Challenges and Avenues in a Big Data World [article]

Stéphane Vannitsem, John Bjørnar Bremnes, Jonathan Demaeyer, Gavin R. Evans, Jonathan Flowerdew, Stephan Hemri, Sebastian Lerch, Nigel Roberts, Susanne Theis, Aitor Atencia, Zied Ben Bouallègue, Jonas Bhend (+12 others)
2020 arXiv   pre-print
Statistical postprocessing techniques are nowadays key components of the forecasting suites in many National Meteorological Services (NMS), with, for most of them, the objective of correcting the impact ... The paper is an attempt to summarize the main activities going on in this area, from theoretical developments to operational applications, with a focus on the current challenges and potential avenues in the ... (including multiple predictors) and output. ...
arXiv:2004.06582v1 fatcat:kjruwwm7krgwjp3rrxdb5twrhe

Statistical Postprocessing for Weather Forecasts – Review, Challenges and Avenues in a Big Data World

Stéphane Vannitsem, John Bjørnar Bremnes, Jonathan Demaeyer, Gavin R. Evans, Jonathan Flowerdew, Stephan Hemri, Sebastian Lerch, Nigel Roberts, Susanne Theis, Aitor Atencia, Zied Ben Bouallègue, Jonas Bhend (+12 others)
2020 Bulletin of The American Meteorological Society - (BAMS)  
Capsule: State-of-the-art statistical postprocessing techniques for ensemble forecasts are reviewed, together with the challenges posed by a demand for timely, high-resolution and reliable probabilistic ... multiple predictors) and output. ... Taillardat et al. (2016) propose a postprocessing model using quantile regression forests, a quantile regression method where predictive quantiles are computed based on random forests (Breiman 2001; ...
doi:10.1175/bams-d-19-0308.1 fatcat:zhjx7otdzbfhfdouyz4qcjbsu4

Modelling heterogeneous distributions with an Uncountable Mixture of Asymmetric Laplacians [article]

Axel Brando, Jose A. Rodríguez-Serrano, Jordi Vitrià, Alberto Rubio
2019 arXiv   pre-print
variable and shows its connections to quantile regression. ... In this paper, we propose a generic deep learning framework that learns an Uncountable Mixture of Asymmetric Laplacians (UMAL), which will allow us to estimate heterogeneous distributions of the output ... Figure 1: Regression problem with heterogeneous output distributions modelled with UMAL. ...
arXiv:1910.12288v2 fatcat:2j7us5ckdrfdnifd2klc2leozi

Beyond Pinball Loss: Quantile Methods for Calibrated Uncertainty Quantification [article]

Youngseog Chung, Willie Neiswanger, Ian Char, Jeff Schneider
2021 arXiv   pre-print
A model that predicts the true conditional quantiles for each input, at all quantile levels, presents a correct and efficient representation of the underlying uncertainty.  ...  In particular, we propose methods that can apply to any class of regression model, allow for selecting a trade-off between calibration and sharpness, optimize for calibration of centered intervals, and  ...  MAQR has a two-step training process: we first learn a mean model, then construct a quantile dataset D, then regress onto this dataset with the quantile model.  ... 
arXiv:2011.09588v4 fatcat:amkf3pw5tbchxe77lju6ls6kw4
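
Average (probabilistic) calibration, the target notion in this line of work, is directly checkable: a quantile model is calibrated if, for each level τ, the observed frequency of y falling below the predicted τ-quantile is about τ. A numpy sketch of that check, using oracle Gaussian quantiles on synthetic data as an assumed stand-in for a trained quantile model:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 5000)
y = x + rng.normal(0, 0.5, 5000)   # true tau-quantile: x + 0.5 * z_tau

taus = np.linspace(0.05, 0.95, 19)
obs = []
for tau in taus:
    q = x + 0.5 * NormalDist().inv_cdf(tau)   # oracle quantile predictions
    obs.append(np.mean(y <= q))               # observed coverage at level tau

# Mean absolute gap between observed and nominal levels (near 0 if calibrated):
cal_err = np.mean(np.abs(np.array(obs) - taus))
```

The same loop applied to a trained quantile model gives the reliability diagram used to diagnose miscalibration, and its complement (interval width) measures sharpness.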

Uncertainty-Aware Time-to-Event Prediction using Deep Kernel Accelerated Failure Time Models [article]

Zhiliang Wu, Yinchong Yang, Peter A. Fasching, Volker Tresp
2021 arXiv   pre-print
Furthermore, a deep metric learning based pre-training step is adapted to enhance the proposed model.  ...  with an SVGP output layer.  ...  In contrast, the DKAFT models with a PPGP output layer outperform the ones with an SVGP output layer in the LoS-MIMIC prediction task.  ... 
arXiv:2107.12250v1 fatcat:t2bms2olijc2bfbxhwt3qcazyq
Showing results 1–15 out of 2,448 results