
Interpretable deep Gaussian processes with moments [article]

Chi-Ken Lu, Scott Cheng-Hsin Yang, Xiaoran Hao, Patrick Shafto
2019 arXiv   pre-print
Deep Gaussian Processes (DGPs) combine the expressiveness of Deep Neural Networks (DNNs) with the quantified uncertainty of Gaussian Processes (GPs).  ...  Expressive power and intractable inference both result from the non-Gaussian distribution over composition functions.  ...  Deep SE Compositions: We examine two different methods for obtaining the effective covariance function in SE[SE[SE]].  ...
arXiv:1905.10963v3 fatcat:7vrmyucvajhyljxdc6zvlqxdvq
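
As background for the moment computation this abstract alludes to, here is a hedged sketch assuming an outer squared-exponential (SE) kernel with lengthscale ℓ applied to an inner GP f with kernel k_1 (not necessarily the authors' exact derivation). With d = f(x) - f(x') Gaussian with variance σ_d² = k_1(x,x) + k_1(x',x') - 2 k_1(x,x'), the effective covariance of the composition is

\[
k_{\mathrm{eff}}(x,x') = \mathbb{E}\!\left[\exp\!\left(-\frac{d^2}{2\ell^2}\right)\right] = \left(1 + \frac{\sigma_d^2}{\ell^2}\right)^{-1/2},
\]

using the standard Gaussian moment identity for E[exp(-d²/(2ℓ²))] with d ~ N(0, σ_d²).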

An Interpretable and Sample Efficient Deep Kernel for Gaussian Process

Yijue Dai, Tianjian Zhang, Zhidi Lin, Feng Yin, Sergios Theodoridis, Shuguang Cui
2020 Conference on Uncertainty in Artificial Intelligence  
We propose a novel Gaussian process kernel that takes advantage of a deep neural network (DNN) structure but retains good interpretability.  ...  The designed kernel does not sacrifice interpretability for optimality.  ...  GAUSSIAN PROCESS REGRESSION: A GP is a collection of random variables, any finite number of which have a joint Gaussian distribution (Williams and Rasmussen, 2006).  ...
dblp:conf/uai/DaiZLYTC20 fatcat:vc4ayh46hnagbf4b63364xbwuy
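
The GP definition quoted in this abstract is the standard one from Williams and Rasmussen (2006): writing f ~ GP(m, k),

\[
\bigl(f(x_1), \ldots, f(x_n)\bigr) \sim \mathcal{N}(\mathbf{m}, \mathbf{K}), \qquad m_i = m(x_i), \quad K_{ij} = k(x_i, x_j),
\]

for any finite collection of inputs x_1, ..., x_n.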

Interpretable Prediction of Urban Mobility Flows with Deep Neural Networks as Gaussian Processes

Aike Steentoft, Bu-Sung Lee, Markus Schläpfer
2022
To that end, we propose a Bayesian deep-learning approach that formulates deep neural networks as Gaussian processes and integrates automatic variable selection.  ...  However, existing methods cannot quantify the uncertainty of the predictions, limiting their interpretability and thus their use for practical applications in urban infrastructure planning.  ... 
doi:10.48350/169750 fatcat:45b6travinb5vhvlon7fyd52wa
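
Automatic variable selection in GP models is commonly realized through automatic relevance determination (ARD); the following is a sketch under the assumption that this is the mechanism meant here:

\[
k(x, x') = \sigma_f^2 \exp\!\left(-\frac{1}{2}\sum_{d=1}^{D}\frac{(x_d - x'_d)^2}{\ell_d^2}\right),
\]

where a large learned lengthscale ℓ_d effectively switches input dimension d off.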

New Directions for Learning with Kernels and Gaussian Processes (Dagstuhl Seminar 16481)

Arthur Gretton, Philipp Hennig, Carl Edward Rasmussen, Bernhard Schölkopf, Marc Herbstritt
2017 Dagstuhl Reports  
The Dagstuhl Seminar 16481 "New Directions for Learning with Kernels and Gaussian Processes" brought together two principal theoretical camps of the machine learning community at a crucial time for  ...  Kernel methods and Gaussian process models together form a significant part of the discipline's foundations, but their prominence is waning while more elaborate but poorly understood hierarchical models  ...  Practical Challenges of Gaussian Process Applications. Deep kernels and deep Gaussian processes: We showed how we can construct deep kernels by composing their implicit features, and examine the properties  ...
doi:10.4230/dagrep.6.11.142 dblp:journals/dagstuhl-reports/GrettonHRS16 fatcat:sky4bixr6fg3djtboe6ueirthi
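
The "composing their implicit features" construction mentioned in the snippet can be written as follows: if k_1 has feature map φ_1 with k_1(x,x') = ⟨φ_1(x), φ_1(x')⟩, then a two-layer kernel is

\[
k_{\mathrm{deep}}(x, x') = k_2\bigl(\varphi_1(x), \varphi_1(x')\bigr),
\]

and deeper kernels follow by iterating the construction.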

Deep Kernel Learning [article]

Andrew Gordon Wilson, Zhiting Hu, Ruslan Salakhutdinov, Eric P. Xing
2015 arXiv   pre-print
We jointly learn the properties of these kernels through the marginal likelihood of a Gaussian process.  ...  On a large and diverse collection of applications, including a dataset with 2 million examples, we show improved performance over scalable Gaussian processes with flexible kernel learning models, and stand-alone  ...  Conditioned on all kernel hyperparameters, we can interpret our model as applying a Gaussian process with base kernel k_θ to the final hidden layer of a deep network.  ...
arXiv:1511.02222v1 fatcat:guzfr767yfaupjvorzo2rycmzy
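
The joint learning scheme described above (a base kernel applied to a network's final hidden layer, trained through the GP marginal likelihood) can be sketched in a few lines of NumPy. This is a hedged, illustrative sketch, not the authors' implementation; the toy network phi, its weights, and all hyperparameter values are assumptions for demonstration.

# Minimal sketch of deep kernel learning (illustrative, not the authors'
# code). A base RBF kernel k_theta is applied to the final hidden layer
# phi_w(x) of a small network; phi_w and theta would be learned jointly
# by maximizing the GP log marginal likelihood below.
import numpy as np

rng = np.random.default_rng(0)

def phi(X, W1, W2):
    """Toy two-layer feature extractor (tanh MLP), stands in for a DNN."""
    return np.tanh(np.tanh(X @ W1) @ W2)

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Base kernel k_theta on the extracted features."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_log_marginal_likelihood(K, y, noise=1e-2):
    """log p(y | X) for GP regression; the joint training objective."""
    n = len(y)
    L = np.linalg.cholesky(K + noise * np.eye(n))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()
            - 0.5 * n * np.log(2 * np.pi))

# Toy data and randomly initialized network weights (would be optimized).
X = rng.normal(size=(50, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
W1, W2 = rng.normal(size=(3, 8)), rng.normal(size=(8, 2))

K = rbf(phi(X, W1, W2), phi(X, W1, W2))
print(gp_log_marginal_likelihood(K, y))

In practice the network weights and kernel hyperparameters would be optimized jointly, e.g. by gradient ascent on this objective.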

Recent Advances in Data-Driven Wireless Communication Using Gaussian Processes: A Comprehensive Survey [article]

Kai Chen, Qinglei Kong, Yijue Dai, Yue Xu, Feng Yin, Lexi Xu, Shuguang Cui
2021 arXiv   pre-print
In this paper, we review a promising family of nonparametric Bayesian machine learning methods, i.e., Gaussian processes (GPs), and their applications in wireless communication.  ...  The expressiveness of the GP model using various interpretable kernel designs is surveyed, namely, stationary, non-stationary, deep, and multi-task kernels.  ...  SCALABLE DISTRIBUTED GAUSSIAN PROCESS: The distributed Gaussian process (DGP) in wireless communication involves learning on distributed edge devices.  ...
arXiv:2103.10134v3 fatcat:bhox7nbavvcb7lnzndu2zr44r4
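
The distributed GP setting mentioned in the last snippet can be illustrated with the product-of-experts rule, one common way to fuse local GP predictions from separate devices; a generic sketch, not necessarily the aggregation used in the surveyed systems.

# Hedged sketch of product-of-experts aggregation for distributed GPs:
# each edge device returns a local predictive mean/variance, and the
# server fuses them by precision weighting.
import numpy as np

def poe_aggregate(means, variances):
    """Fuse K local GP predictions at the same test points."""
    means, variances = np.asarray(means), np.asarray(variances)
    precision = (1.0 / variances).sum(axis=0)   # sum of local precisions
    agg_var = 1.0 / precision
    agg_mean = agg_var * (means / variances).sum(axis=0)
    return agg_mean, agg_var

# Three devices, two test points.
mu, var = poe_aggregate(
    means=[[0.9, -0.2], [1.1, 0.0], [1.0, -0.1]],
    variances=[[0.2, 0.5], [0.3, 0.4], [0.25, 0.6]],
)
print(mu, var)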

Hybrid Bayesian Neural Networks with Functional Probabilistic Layers [article]

Daniel T. Chang
2021 arXiv   pre-print
and Gaussian process models.  ...  We discuss their foundations in functional Bayesian inference, functional variational inference, sparse Gaussian processes, and sparse variational Gaussian processes.  ...  For the second example, we add a second GPLayer, effectively using deep Gaussian processes: The predictions made by the model are shown below.  ... 
arXiv:2107.07014v1 fatcat:nxrsdrxklvgpffpalue3r3taam
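
Conceptually, adding a second GP layer composes two GP draws: the first layer's function values become the second layer's inputs. A minimal NumPy sketch of sampling from such a two-layer deep GP prior (not the paper's GPflux code):

# Draw f1 ~ GP(0, k) on the inputs, then draw f2 ~ GP(0, k) using f1's
# values as the second layer's inputs.
import numpy as np

rng = np.random.default_rng(1)

def k_se(a, b, ell=1.0):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

x = np.linspace(-3, 3, 100)
jitter = 1e-8 * np.eye(len(x))

f1 = rng.multivariate_normal(np.zeros(len(x)), k_se(x, x) + jitter)
f2 = rng.multivariate_normal(np.zeros(len(x)), k_se(f1, f1) + jitter)
print(f2[:5])  # one sample from a two-layer deep GP prior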

Statistical Deep Learning for Spatial and Spatio-Temporal Data [article]

Christopher K. Wikle, Andrew Zammit-Mangion
2022 arXiv   pre-print
deep Gaussian processes.  ...  overview of traditional statistical and machine learning perspectives for modeling spatial and spatio-temporal data, and then focus on a variety of hybrid models that have recently been developed for latent process  ...  The authors would like to thank Yi Cao, Wanfang Chen, and Per Sidén for help with implementing and running software for DeepKriging and deep GMRFs.  ...
arXiv:2206.02218v1 fatcat:jbn4rszdxvcajgp3hsnr5i7bqa

Automated Assessment of Bone Age Using Deep Learning and Gaussian Process Regression

Tom Van Steenkiste, Joeri Ruyssinck, Olivier Janssens, Baptist Vandersmissen, Florian Vandecasteele, Pieter Devolder, Eric Achten, Sofie Van Hoecke, Dirk Deschrijver, Tom Dhaene
2018 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)  
It consists of a powerful combination of deep learning and Gaussian process regression.  ...  However, the reference models have been described as leaving room for interpretation, leading to large inter-observer and intra-observer variation.  ...  The proposed methodology of using Gaussian process regression to aggregate augmented deep learning results clearly improves on the standard state-of-the-art deep learning performance.  ...
doi:10.1109/embc.2018.8512334 pmid:30440486 fatcat:zbtrawwtyvf2hlnesoy3zswque

Deep Gaussian Processes: A Survey [article]

Kalvik Jakkala
2021 arXiv   pre-print
Furthermore, one particular research area, Deep Gaussian Processes (DGPs), has improved substantially in the past decade.  ...  Most existing surveys focus on only one particular variant of Gaussian processes and their derivatives.  ...  Sparse Gaussian Processes address the computational and storage costs, and the feature extraction issue is addressed by Deep Gaussian Processes.  ...
arXiv:2106.12135v1 fatcat:7ny4sg4zi5acbghimkfq7jh4gm
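
The computational fix credited to Sparse Gaussian Processes above rests on inducing points. A minimal sketch of the underlying Nyström approximation follows; actual sparse GP methods (e.g. variational inducing-point schemes) add substantially more machinery.

# Approximate the n x n kernel matrix with a Nystrom factorization
# through m << n inducing inputs, cutting cost from O(n^3) toward O(n m^2).
import numpy as np

rng = np.random.default_rng(2)

def k_se(A, B, ell=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

X = rng.normal(size=(500, 2))                    # full training inputs
Z = X[rng.choice(500, size=20, replace=False)]   # m = 20 inducing inputs

Knm = k_se(X, Z)
Kmm = k_se(Z, Z) + 1e-8 * np.eye(20)
K_approx = Knm @ np.linalg.solve(Kmm, Knm.T)     # Nystrom: Knm Kmm^-1 Kmn

K_exact = k_se(X, X)
print(np.abs(K_exact - K_approx).max())          # approximation error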

Stochastic Variational Deep Kernel Learning [article]

Andrew Gordon Wilson, Zhiting Hu, Ruslan Salakhutdinov, Eric P. Xing
2016 arXiv   pre-print
Specifically, we apply additive base kernels to subsets of output features from deep neural architectures, and jointly learn the parameters of the base kernels and deep network through a Gaussian process  ...  We show improved performance over stand-alone deep networks, SVMs, and state-of-the-art scalable Gaussian processes on several classification benchmarks, including an airline delay dataset containing 6  ...  Gaussian processes.  ...
arXiv:1611.00336v2 fatcat:tdi46dwdejd3teezh3gkoqdjhm
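
The additive structure described in the first snippet (base kernels applied to subsets of the network's output features, then summed) can be sketched directly; this is illustrative NumPy, not the authors' implementation.

# Split the network's output features into subsets and sum an RBF base
# kernel applied to each subset.
import numpy as np

def rbf(A, B, ell=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def additive_deep_kernel(H, subsets, ell=1.0):
    """H: (n, d) final-layer features; subsets: lists of column indices."""
    n = H.shape[0]
    K = np.zeros((n, n))
    for idx in subsets:
        K += rbf(H[:, idx], H[:, idx], ell)   # one base kernel per subset
    return K

H = np.random.default_rng(3).normal(size=(6, 4))
K = additive_deep_kernel(H, subsets=[[0, 1], [2, 3]])
print(K.shape)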

Neuro-symbolic Neurodegenerative Disease Modeling as Probabilistic Programmed Deep Kernels [article]

Alexander Lavin
2021 arXiv   pre-print
Our Bayesian approach combines the flexibility of Gaussian processes with the structural power of neural networks to model biomarker progressions, without needing clinical labels for training.  ...  We present a probabilistic programmed deep kernel learning approach to personalized, predictive modeling of neurodegenerative diseases.  ...  Monotonic Gaussian Processes: A GP is a stochastic process which is fully specified by its mean function and covariance function, such that any finite set of random variables has a joint Gaussian distribution  ...
arXiv:2009.07738v3 fatcat:snzegvlnyzdn3fibg56wsurbki

Implicit Priors for Knowledge Sharing in Bayesian Neural Networks [article]

Jack K Fitzsimons, Sebastian M Schmon, Stephen J Roberts
2019 arXiv   pre-print
Theoretically rooted in the concepts of Bayesian neural networks, this work has widespread application to general deep learning.  ...  Bayesian interpretations of neural networks have a long history, dating back to early work in the 1990s, and have recently regained attention because of their desirable properties like uncertainty estimation  ...  Neural Networks as Gaussian Processes: The Gaussian process interpretation of neural networks originates from early work by [10], where it is shown that an infinite-width single-layer neural network is  ...
arXiv:1912.00874v1 fatcat:2jehspgl3bf4ziknlzomyxkyvi
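
The infinite-width result referenced in this abstract is Neal's construction: for a single-hidden-layer network f(x) = b + Σ_{j=1}^{H} v_j h(x; u_j) with i.i.d. weights, b ~ N(0, σ_b²), and Var(v_j) = σ_v²/H, the central limit theorem gives, as H → ∞,

\[
f \to \mathcal{GP}(0, k), \qquad k(x, x') = \sigma_b^2 + \sigma_v^2\, \mathbb{E}_{u}\bigl[h(x; u)\, h(x'; u)\bigr].
\]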

Learning spectrograms with convolutional spectral kernels [article]

Zheyang Shen, Markus Heinonen, Samuel Kaski
2019 arXiv   pre-print
We introduce the convolutional spectral kernel (CSK), a novel family of non-stationary, nonparametric covariance kernels for Gaussian process (GP) models, derived from the convolution between two imaginary  ...  Observing through the lens of the spectrogram, we provide insight on the interpretability of deep models. We then infer the functional hyperparameters using scalable variational and MCMC methods.  ...  Deeply non-stationary Gaussian processes. In Bayesian Deep Learning workshop, Advances in Neural Information Processing Systems, 2017b.  ...
arXiv:1905.09917v2 fatcat:5w3yuyxla5aaxo45x2cvldrqx4
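
The spectrogram view rests on Bochner's theorem: a stationary kernel is the Fourier transform of a spectral density, and non-stationary kernels such as the CSK let that density vary with location. The snippet does not give the CSK's exact form; as hedged background, the well-known stationary spectral mixture kernel reads

\[
k(\tau) = \sum_{q=1}^{Q} w_q \exp\bigl(-2\pi^2 \tau^2 v_q\bigr)\cos\bigl(2\pi \tau \mu_q\bigr),
\]

with mixture weights w_q, frequencies μ_q, and bandwidths v_q.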

Image Denoising with Control over Deep Network Hallucination

Qiyuan Liang, Florian Cassayre, Haley Owsianko, Majed El Helou, Sabine Süsstrunk
2022 IS&T International Symposium on Electronic Imaging Science and Technology  
For better control over and interpretability of a deep denoiser, we propose a novel framework exploiting a denoising network. We call it controllable confidence-based image denoising (CCID).  ...  Results show that our CCID not only provides more interpretability and control, but can even surpass the quantitative performance of both the deep denoiser and the reliable filter, especially  ...  Deep networks are also inherently a black box, making it difficult to interpret their outputs.  ...
doi:10.2352/ei.2022.34.14.coimg-217 fatcat:5lfhpcw5tvhjrn6w2unwrh7o3u
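
A hedged sketch of confidence-based fusion in the spirit of CCID (not the authors' code): blend the deep denoiser's output with a reliable conventional filter, pixel-wise, according to a confidence map c in [0, 1].

# High confidence -> trust the deep denoiser; low -> fall back to the filter.
import numpy as np

def fuse(deep_output, filtered_output, confidence):
    """Pixel-wise convex combination weighted by the confidence map."""
    c = np.clip(confidence, 0.0, 1.0)
    return c * deep_output + (1.0 - c) * filtered_output

rng = np.random.default_rng(4)
deep = rng.random((8, 8))       # stand-in for a deep denoiser's output
reliable = rng.random((8, 8))   # stand-in for a conventional filter's output
conf = rng.random((8, 8))       # stand-in for a confidence map
print(fuse(deep, reliable, conf).mean())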