424,003 Hits in 3.2 sec

Deep Distribution Regression [article]

Rui Li, Howard D. Bondell, Brian J. Reich
2019 arXiv   pre-print
In this article, we provide a general solution by transforming a conditional distribution estimation problem into a constrained multi-class classification problem, in which tools such as deep neural networks  ...  Due to their flexibility and predictive performance, machine-learning based regression methods have become an important tool for predictive modeling and forecasting.  ...  Note that we are not aware of results showing that deep learning estimation of π_k(X) obtains properties (iii) and (iv).  ... 
arXiv:1903.06023v1 fatcat:pnh3q2jeereobat5qlkjbd27l4
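The core idea this abstract describes, recasting conditional density estimation as multi-class classification over bins of the response, can be sketched with a minimal NumPy softmax classifier. The bin count, learning rate, and toy setup below are illustrative choices, not taken from the paper, and the paper's additional ordering constraints across bins are omitted:

```python
import numpy as np

def fit_binned_density(X, y, n_bins=10, lr=0.5, epochs=500, seed=0):
    """Estimate p(y|x) by binning y and fitting multinomial softmax regression."""
    rng = np.random.default_rng(seed)
    edges = np.quantile(y, np.linspace(0, 1, n_bins + 1))    # quantile bin edges
    labels = np.clip(np.searchsorted(edges, y, side="right") - 1, 0, n_bins - 1)
    Xb = np.hstack([X, np.ones((len(X), 1))])                # add bias column
    W = rng.normal(scale=0.01, size=(Xb.shape[1], n_bins))
    onehot = np.eye(n_bins)[labels]
    for _ in range(epochs):                                  # batch gradient descent
        logits = Xb @ W
        P = np.exp(logits - logits.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)
        W -= lr * Xb.T @ (P - onehot) / len(Xb)
    return W, edges

def predict_density(W, X):
    """Per-bin probabilities: a histogram estimate of the conditional distribution."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    logits = Xb @ W
    P = np.exp(logits - logits.max(axis=1, keepdims=True))
    return P / P.sum(axis=1, keepdims=True)
```

Any classifier that outputs class probabilities (including a deep network, as in the paper) can replace the softmax regression here.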

Marginally-calibrated deep distributional regression [article]

Nadja Klein, David J. Nott, Michael Stanley Smith
2020 arXiv   pre-print
However, our main motivating applications are in likelihood-free inference, where distributional deep regression is used to estimate marginal posterior distributions.  ...  Deep neural network (DNN) regression models are widely used in applications requiring state-of-the-art predictive accuracy.  ...  To do so we develop a new scalable method for 'distributional deep regression', by which we mean a DNN regression method that provides predictions for the full distribution.  ... 
arXiv:1908.09482v2 fatcat:z6fckteetjgnfdtryt6y37cn3m

Semi-Structured Distributional Regression – Extending Structured Additive Models by Arbitrary Deep Neural Networks and Data Modalities [article]

David Rügamer, Chris Kolb, Nadja Klein
2022 arXiv   pre-print
We propose a general framework to combine structured regression models and deep neural networks into a unifying network architecture.  ...  Combining additive models and neural networks makes it possible to broaden the scope of statistical regression and extend deep learning-based approaches by interpretable structured additive predictors at the same  ...  Our proposal can also be seen as an extension of mean regression wide and deep models to distributional wide and deep networks.  ... 
arXiv:2002.05777v5 fatcat:enuxjymgfzaljo7ocqcbkckth4

deepregression: a Flexible Neural Network Framework for Semi-Structured Deep Distributional Regression [article]

David Rügamer, Chris Kolb, Cornelius Fritz, Florian Pfisterer, Philipp Kopper, Bernd Bischl, Ruolin Shen, Christina Bukas, Lisa Barros de Andrade e Sousa, Dominik Thalmeier, Philipp Baumann, Lucas Kook (+2 others)
2022 arXiv   pre-print
In this paper we describe the implementation of semi-structured deep distributional regression, a flexible framework to learn conditional distributions based on the combination of additive regression models and deep networks.  ...  Semi-structured deep distributional regression (SDDR) is based on ideas from distributional regression for parametric distributions (see, e.g., Klein, Kneib, Lang, and Sohn 2015), to estimate the entire  ... 
arXiv:2104.02705v3 fatcat:6oo5btaohjbtjpoau7eto2fxfq

Unimodal regularisation based on beta distribution for deep ordinal regression

Víctor Manuel Vargas, Pedro Antonio Gutiérrez, César Hervás-Martínez
2021 Pattern Recognition  
The regularised loss function is used to train a deep neural network model with an ordinal scheme in the output layer.  ...  Currently, the use of deep learning for solving ordinal classification problems, where categories follow a natural order, has not received much attention.  ...  Another common approach is to convert the ordinal regression problem to a standard regression one [24].  ... 
doi:10.1016/j.patcog.2021.108310 fatcat:gmblepvzenh3zfil45frtjlk4e
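As a rough illustration of the regularisation idea described in this entry, replacing one-hot targets with unimodal soft labels derived from a beta density, one could build soft targets as below. The concentration constant and the α/β parameterisation are illustrative guesses, not the paper's exact scheme:

```python
def beta_soft_labels(true_class, n_classes, concentration=10.0):
    """Unimodal soft targets: unnormalised Beta pdf evaluated at class midpoints."""
    mode = (true_class + 0.5) / n_classes            # place the beta mode in the true bin
    a = mode * (concentration - 2.0) + 1.0           # Beta(a, b) chosen so its mode is `mode`
    b = (1.0 - mode) * (concentration - 2.0) + 1.0
    mids = [(k + 0.5) / n_classes for k in range(n_classes)]
    dens = [x ** (a - 1.0) * (1.0 - x) ** (b - 1.0) for x in mids]
    total = sum(dens)
    return [d / total for d in dens]                 # normalise to a distribution
```

Training then minimises cross-entropy against these soft targets instead of one-hot labels, which penalises predictions that place mass on classes far from the true one in the ordinal scale.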

Structure and Distribution Metric for Quantifying the Quality of Uncertainty: Assessing Gaussian Processes, Deep Neural Nets, and Deep Neural Operators for Regression [article]

Ethan Pickering, Themistoklis P. Sapsis
2022 arXiv   pre-print
We propose two bounded comparison metrics that may be implemented to arbitrary dimensions in regression tasks.  ...  We apply these metrics to Gaussian Processes (GPs), Ensemble Deep Neural Nets (DNNs), and Ensemble Deep Neural Operators (DNOs) on high-dimensional and nonlinear test cases.  ...  Here, we specifically consider Gaussian process (GP) regression, deep neural networks (DNNs/NNs) and deep neural operators (DNOs) for this purpose.  ... 
arXiv:2203.04515v1 fatcat:u6haoyjmcjag3jydagkwr66qoq

Space-Time Distribution Laws of Tunnel Excavation Damaged Zones (EDZs) in Deep Mines and EDZ Prediction Modeling by Random Forest Regression

Qiang Xie, Kang Peng
2019 Advances in Civil Engineering  
In a layered rock mass, the distribution of EDZs is more difficult to identify.  ...  The space-time distribution laws of the range of EDZs during the excavation process of the roadway were analyzed.  ...  As a result, the radial stress decreases, and the tangential stress increases [13]. The deep rocks distributed in the area of redistribution stress are called engineering surrounding rocks.  ... 
doi:10.1155/2019/6505984 fatcat:6zt4t2p2pndz5lpycgt2ecqaju

Deep Ensembles from a Bayesian Perspective [article]

Lara Hoffmann, Clemens Elster
2021 arXiv   pre-print
Deep ensembles can be considered as the current state-of-the-art for uncertainty quantification in deep learning.  ...  We show that deep ensembles can be viewed as an approximate Bayesian method by specifying the corresponding assumptions.  ...  Deep ensembles as a Bayesian approximation In this work we focus on regression problems.  ... 
arXiv:2105.13283v2 fatcat:stxol5heqrbypc7suuv5uqer5q
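The standard deep-ensemble recipe this abstract builds on, where each network outputs a Gaussian mean and variance and the ensemble is treated as a uniform mixture, reduces to simple moment formulas. A minimal sketch (the mixture-moment identities are standard; function names and numbers are illustrative):

```python
import numpy as np

def ensemble_predictive(means, variances):
    """Mean and variance of a uniform mixture of Gaussians (deep-ensemble prediction)."""
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    mu = means.mean(axis=0)
    # law of total variance: average member variance + variance of member means
    var = (variances + means ** 2).mean(axis=0) - mu ** 2
    return mu, var
```

The second term is what captures model (epistemic) uncertainty: it vanishes when all ensemble members agree and grows as their predictions disagree.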

Regression Based Clustering by Deep Adversarial Learning

Fei Tang, Dabin Zhang, Tie Cai, Qin Li
2020 IEEE Access  
In this paper, we utilize deep adversarial regression to tackle these problems and formulate regression based clustering by deep adversarial learning (RCDA).  ...  and target distribution, so as to improve representation learning.  ...  (DSC) [32], latent distribution preserving deep subspace clustering (DPSC) [33], deep clustering with sample-assignment invariance prior (DCSAIP) [34].  ... 
doi:10.1109/access.2020.3014631 fatcat:zmws2m7o4fezhetj2pkbwwpgjy

Visualizing the decision-making process in deep neural decision forest [article]

Shichao Li, Kwang-Ting Cheng
2019 arXiv   pre-print
We then apply NDF on a multi-task coordinate regression problem and demonstrate the distribution of routing probabilities, which is vital for interpreting NDF yet not shown for regression problems.  ...  Deep neural decision forest (NDF) achieved remarkable performance on various vision tasks via combining decision tree and deep representation learning.  ...  Here we also demonstrate the distribution of routing probabilities for a regression problem.  ... 
arXiv:1904.09201v1 fatcat:4zm5va2z2ng5vluuduw2nnj7ni

Noise-Sampling Cross Entropy Loss: Improving Disparity Regression Via Cost Volume Aware Regularizer [article]

Yang Chen, Zongqing Lu, Xuechen Zhang, Lei Chen, Qingmin Liao
2020 arXiv   pre-print
Recent end-to-end deep neural networks for disparity regression have achieved the state-of-the-art performance.  ...  However, many well-acknowledged specific properties of disparity estimation are omitted in these deep learning algorithms.  ...  cost volume produced by deep disparity regression network.  ... 
arXiv:2005.08806v2 fatcat:gcsvudoz4naybmq6a2nk5c2wdu

An Attention-Guided Deep Regression Model for Landmark Detection in Cephalograms [article]

Zhusi Zhong, Jie Li, Zhenxi Zhang, Zhicheng Jiao, Xinbo Gao
2020 arXiv   pre-print
The proposed framework is based on a 2-stage U-Net, regressing the multi-channel heatmaps for landmark detection.  ...  In this paper, we propose a deep learning based framework to automatically detect anatomical landmarks in cephalometric X-ray images.  ...  And our deep regression model is easily generalized to other landmark detection tasks. Fig. 1. Overall framework of the Attention-Guided deep regression model.  ... 
arXiv:1906.07549v2 fatcat:kiflhrcgnrfvhc2mti66ifcdae

Distributional Transformation Improves Decoding Accuracy When Predicting Chronological Age From Structural MRI

Joram Soch
2020 Frontiers in Psychiatry  
We found that (i) when the number of features is low, no method outperforms linear regression; and (ii) except when using deep regression, distributional transformation increases decoding performance,  ...  In a low-dimensional setting, i.e., with fewer features than observations, we applied multiple linear regression, support vector regression and deep neural networks for out-of-sample prediction of subject  ...  GLM, multiple linear regression; SVR, support vector regression; DNN, deep neural network regression; DT, distributional transformation.  ... 
doi:10.3389/fpsyt.2020.604268 pmid:33363488 pmcid:PMC7752921 fatcat:h7mfzekcpfb27owr5h3kzxg4xu
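Distributional transformation as used in this study maps predicted values so that their distribution matches that of a reference sample (here, the training targets). One simple rank-based reading of the idea, a sketch rather than the paper's exact implementation:

```python
import numpy as np

def distributional_transform(y_pred, y_ref):
    """Replace each prediction by the reference quantile at that prediction's rank."""
    y_pred = np.asarray(y_pred, dtype=float)
    ranks = y_pred.argsort().argsort()          # rank of each prediction within the batch
    q = (ranks + 0.5) / len(y_pred)             # empirical quantile levels
    return np.quantile(np.asarray(y_ref, dtype=float), q)
```

The transform preserves the ordering of the predictions while forcing their marginal distribution toward that of the reference, which is why it can help when a decoder systematically compresses the target range.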

A framework for benchmarking uncertainty in deep regression [article]

Franko Schmähling, Jörg Martin, Clemens Elster
2021 arXiv   pre-print
We propose a framework for the assessment of uncertainty quantification in deep regression.  ...  We illustrate the proposed framework by applying it to current approaches for uncertainty quantification in deep regression.  ...  Deep regression intends to infer the regression function G(x)^T γ, and it does not make use of the specific structure of (1).  ... 
arXiv:2109.09048v1 fatcat:i3s7f4dqwzgbffdt7uolihuvme
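One elementary check a benchmarking framework like this would include is empirical coverage: how often the reported predictive interval actually contains the truth. A hedged sketch (the 1.96 factor assumes Gaussian predictive distributions; names are illustrative):

```python
import numpy as np

def empirical_coverage(y_true, mu, sigma, z=1.96):
    """Fraction of targets inside the central predictive interval mu +/- z*sigma."""
    y_true, mu, sigma = map(np.asarray, (y_true, mu, sigma))
    inside = np.abs(y_true - mu) <= z * sigma
    return inside.mean()
```

For a well-calibrated method the value should be close to the nominal level (about 0.95 for z = 1.96); markedly lower values indicate overconfident uncertainty estimates, markedly higher values underconfident ones.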

Distributional Transformation improves Decoding Accuracy when Predicting Chronological Age from Structural MRI [article]

Joram Soch
2020 bioRxiv   pre-print
We found that (i) when the number of features is low, no method outperforms linear regression; and (ii) except when using deep regression, distributional transformation increases decoding performance,  ...  In a low-dimensional setting, i.e., with fewer features than observations, we applied multiple linear regression, support vector regression and deep neural networks for out-of-sample prediction of subject  ...  Abbreviations: GLM = multiple linear regression; SVR = support vector regression; DNN = deep neural network regression; DT = distributional transformation.  ... 
doi:10.1101/2020.09.11.293811 fatcat:y2qgjpwmybgnbcdjmpgaowgwwe
Showing results 1–15 out of 424,003 results