29,441 Hits in 6.0 sec

Scalable Multi-Class Bayesian Support Vector Machines for Structured and Unstructured Data [article]

Martin Wistuba, Ambrish Rawat
2018 arXiv   pre-print
Furthermore, we develop hybrid Bayesian neural networks that combine standard deep learning components with the proposed model to enable learning for unstructured data.  ...  We introduce a new Bayesian multi-class support vector machine by formulating a pseudo-likelihood for a multi-class hinge loss in the form of a location-scale mixture of Gaussians.  ...  We compare the proposed multi-class SVM on 68 structured data sets to a state-of-the-art binary Bayesian SVM with the one-vs-rest approach and the scalable variational Gaussian process [12].  ... 
arXiv:1806.02659v1 fatcat:wfxtslmabja5blip44hmeip74u
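The location-scale mixture mentioned in this abstract follows the well-known data-augmentation identity for the hinge-loss pseudo-likelihood (Polson and Scott's result for Bayesian SVMs): exp(-2 max(0, 1-u)) equals an integral over a latent Gaussian scale. A minimal numerical sketch of that identity (not the paper's code; function names are illustrative):

```python
import numpy as np
from scipy.integrate import quad

def hinge_pseudo_likelihood(u):
    # Pseudo-likelihood of the hinge loss at margin u = y * f(x).
    return np.exp(-2.0 * max(0.0, 1.0 - u))

def gaussian_mixture_form(u):
    # Location-scale mixture of Gaussians over a latent scale lam:
    # integral_0^inf (2*pi*lam)^(-1/2) * exp(-(1 + lam - u)^2 / (2*lam)) dlam
    integrand = lambda lam: (2 * np.pi * lam) ** -0.5 * np.exp(-(1 + lam - u) ** 2 / (2 * lam))
    # Split the domain so quad handles the lam -> 0 behaviour and the infinite tail separately.
    v1, _ = quad(integrand, 0.0, 1.0)
    v2, _ = quad(integrand, 1.0, np.inf)
    return v1 + v2

# The two forms agree for margins on both sides of the hinge point u = 1.
for u in [-1.0, 0.0, 0.5, 1.0, 2.0]:
    assert abs(hinge_pseudo_likelihood(u) - gaussian_mixture_form(u)) < 1e-4
```

The augmentation matters because, conditional on the latent scale, the model is Gaussian, which is what makes variational or Gibbs-style inference tractable.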

Deep Bayesian Active Learning, A Brief Survey on Recent Advances [article]

Salman Mohamadi, Hamidreza Amindavar
2022 arXiv   pre-print
Deep Bayesian active learning frameworks, and Bayesian active learning settings generally, provide practical considerations in the model that allow training with small data while representing the model  ...  In this paper, we briefly survey recent advances in Bayesian active learning, and in particular deep Bayesian active learning frameworks.  ...  and acquisition functions for multiple tasks of natural language processing, and finally show that deep Bayesian active learning consistently provides the best performance.  ... 
arXiv:2012.08044v2 fatcat:g4oyahxjerg6vgomhg5jyutwbu

Variational Resampling Based Assessment of Deep Neural Networks under Distribution Shift [article]

Xudong Sun, Alexej Gossmann, Yu Wang, Bernd Bischl
2019 arXiv   pre-print
A novel variational inference based resampling framework is proposed to evaluate the robustness and generalization capability of deep learning models with respect to distribution shift.  ...  We use Auto-Encoding Variational Bayes to find a latent representation of the data, on which a Variational Gaussian Mixture Model is applied to deliberately create distribution shift by dividing the dataset  ...  Variational Gaussian Mixture Model: Variational Learning of Gaussian Mixture Models (VGMM) [table: average classification accuracies on the training, validation, and testing data splits after 100 training ...]  ... 
arXiv:1906.02972v6 fatcat:w4n4mb2zubatjocl6lkq5simeu

Priors in Bayesian Deep Learning: A Review [article]

Vincent Fortuin
2022 arXiv   pre-print
In this review, we highlight the importance of prior choices for Bayesian deep learning and present an overview of different priors that have been proposed for (deep) Gaussian processes, variational autoencoders  ...  While the choice of prior is one of the most critical parts of the Bayesian inference workflow, recent Bayesian deep learning models have often fallen back on vague priors, such as standard Gaussians.  ...  Luckily, a plethora of alternative prior choices is available for popular Bayesian deep learning models, such as (deep) Gaussian processes, variational autoencoders, and Bayesian neural networks.  ... 
arXiv:2105.06868v3 fatcat:dmra3u2ibzgrnblzsepjgrr6pm

Building Blocks for Variational Bayesian Learning of Latent Variable Models

Tapani Raiko, Harri Valpola, Markus Harva, Juha Karhunen
2007 Journal of machine learning research  
We introduce standardised building blocks designed to be used with variational Bayesian learning. The blocks include Gaussian variables, summation, multiplication, nonlinearity, and delay.  ...  Variational Bayesian learning provides a cost function which is used both for updating the variables of the model and for optimising the model structure.  ...  This research has been funded by the European Commission project BLISS and the Finnish Centre of Excellence Programme (2000–2005) under the project New Information Processing Principles  ... 
dblp:journals/jmlr/RaikoVHK07 fatcat:4uqpt6lxpfbrpegqz5qgxsgpkq

Leveraging the Bayesian Filtering Paradigm for Vision-Based Facial Affective State Estimation

Meshia Oveneke, Isabel Gonzalez, Valentin Enescu, Dongmei Jiang, Hichem Sahli
2017 IEEE Transactions on Affective Computing  
We then pose the affective state estimation problem as a Bayesian filtering problem and provide a solution based on Kalman filtering (KF) for probabilistic reasoning over time, combined with multiple instance sparse Gaussian processes (MI-SGP) for inferring affect-related measurements from image sequences.  ...  Gaussian process (MI-SGP) for regression; (3) We augment the sparse Gaussian process (GP) with a multiple instance Hausdorff squared exponential covariance function to simultaneously cope with annotation  ... 
doi:10.1109/taffc.2016.2643661 fatcat:qfonpv4t6vetpah6np4klxqdn4
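The Bayesian filtering paradigm in this entry is the standard recursive predict/update cycle over a latent state. A minimal linear-Gaussian Kalman filter sketch (the 1-D "affect plus velocity" toy model and all names here are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def kf_predict(m, P, A, Q):
    # Propagate the Gaussian state estimate (mean m, covariance P) through linear dynamics A.
    return A @ m, A @ P @ A.T + Q

def kf_update(m, P, y, H, R):
    # Condition on a new measurement y via the Kalman gain.
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    m_new = m + K @ (y - H @ m)
    P_new = P - K @ S @ K.T
    return m_new, P_new

# Toy run: 2-D state (affect value and its velocity), noisy 1-D observations.
A = np.array([[1.0, 1.0], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q, R = 0.01 * np.eye(2), np.array([[0.5]])
m, P = np.zeros(2), np.eye(2)
for y in [0.1, 0.3, 0.2, 0.4]:
    m, P = kf_predict(m, P, A, Q)
    m, P = kf_update(m, P, np.array([y]), H, R)
```

In the paper's setting the "measurements" would themselves come from the MI-SGP regressors rather than raw sensor readings; the filtering recursion is unchanged.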

Machine-Learning-Driven New Geologic Discoveries at Mars Rover Landing Sites: Jezero and NE Syrtis [article]

Murat Dundar, Bethany L. Ehlmann, Ellen K. Leask
2019 arXiv   pre-print
A hierarchical Bayesian classifier is trained at pixel scale with spectral data from the CRISM (Compact Reconnaissance Imaging Spectrometer for Mars) imagery.  ...  Its utility in detecting rare phases is demonstrated with new geologic discoveries near the Mars-2020 rover landing site.  ...  the global CTX mosaic and other assistance with dataset registration.  ... 
arXiv:1909.02387v1 fatcat:joop6bhjnncqrenl6koagfdztm

Scalable Bayesian Non-linear Matrix Completion [article]

Xiangju Qin, Paul Blomstedt, Samuel Kaski
2019 arXiv   pre-print
We introduce a Bayesian non-linear matrix completion algorithm, which is based on a recent Bayesian formulation of Gaussian process latent variable models.  ...  To solve the challenges regarding scalability and computation, we propose a data-parallel distributed computational approach with a restricted communication scheme.  ...  To this end, we have introduced a computational scheme which leverages embarrassingly parallel techniques developed for Gaussian process regression by suitably adapting them for Bayesian Gaussian process  ... 
arXiv:1908.01009v1 fatcat:be4cprnrf5b4dehcexdhdnb43m

Illustrative Discussion of MC-Dropout in General Dataset: Uncertainty Estimation in Bitcoin

Ismail Alarab, Simant Prakoonwit, Mohamed Ikbal Nacer
2021 Neural Processing Letters  
Recently, the Monte-Carlo dropout (MC-dropout) method has been introduced as a probabilistic approach, based on Bayesian approximation, that is more computationally efficient than Bayesian neural networks.  ...  On the other hand, we apply the MC-dropout method to a dataset derived from Bitcoin, known as the Elliptic data, to highlight how the model with MC-dropout outperforms the standard model.  ...  In other words, Gaussian processes sample prior functions from a multivariate Gaussian distribution and update these priors with newly collected data via Bayes' rule.  ... 
doi:10.1007/s11063-021-10424-x fatcat:pobv6bqan5e6vdfg6ptjjlagmq
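MC-dropout as described here amounts to keeping dropout active at prediction time and running several stochastic forward passes, then reading off the sample mean and spread as predictive mean and uncertainty. A self-contained numpy sketch (the tiny untrained network and all names are illustrative assumptions, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny fixed two-layer network; the weights are random placeholders, not trained.
W1, b1 = rng.normal(size=(16, 4)), np.zeros(16)
W2, b2 = rng.normal(size=(1, 16)), np.zeros(1)

def forward(x, p_drop=0.5):
    # One stochastic pass: dropout stays ACTIVE at prediction time.
    h = np.maximum(W1 @ x + b1, 0.0)
    mask = rng.random(h.shape) > p_drop
    h = h * mask / (1.0 - p_drop)    # inverted-dropout scaling
    return (W2 @ h + b2)[0]

def mc_dropout_predict(x, T=200):
    # T stochastic forward passes approximate the Bayesian predictive distribution.
    samples = np.array([forward(x) for _ in range(T)])
    return samples.mean(), samples.std()   # predictive mean and uncertainty

mean, std = mc_dropout_predict(np.array([0.5, -0.2, 0.1, 0.3]))
```

The spread across passes is what the entry uses to flag uncertain Bitcoin transactions: inputs where the stochastic predictions disagree get a large `std`.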

A review of deterministic approximate inference techniques for Bayesian machine learning

Shiliang Sun
2013 Neural computing & applications (Print)  
A central task of Bayesian machine learning is to infer the posterior distribution of hidden random variables given observations and calculate expectations with respect to this distribution.  ...  Keywords Uncertainty · Probabilistic models · Bayesian machine learning · Posterior distribution · Deterministic approximate inference Introduction Uncertainty is one of the key concepts in modern artificial  ...  [2, 3] proposed the variational Gaussian process approximation for models with non-Gaussian stochastic process priors and Gaussian likelihoods, where the Gaussian and non-Gaussian processes are both  ... 
doi:10.1007/s00521-013-1445-4 fatcat:2tge4ca6vrbtrbiky3i5qdd3lm

Bayesian Perceptron: Towards fully Bayesian Neural Networks [article]

Marco F. Huber
2020 arXiv   pre-print
The weights and the predictions of the perceptron are considered Gaussian random variables.  ...  In this paper a novel approach towards fully Bayesian NNs is proposed, where training and predictions of a perceptron are performed within the Bayesian inference framework in closed form.  ...  Being gradient-free and enabling sequential learning are considered highly beneficial when employing the BP as a building block of a Bayesian NN with multiple layers.  ... 
arXiv:2009.01730v2 fatcat:xfd6flk67fedpey4wwdxeejpqq

Spike and Slab Variational Inference for Multi-Task and Multiple Kernel Learning

Michalis K. Titsias, Miguel Lázaro-Gredilla
2011 Neural Information Processing Systems  
We apply the method to a general multi-task and multiple kernel learning model in which a common set of Gaussian process functions is linearly combined with task-specific sparse weights, thus inducing  ...  We introduce a variational Bayesian inference algorithm which can be widely applied to sparse linear models.  ...  MKT was supported by EPSRC Grant No EP/F005687/1 "Gaussian Processes for Systems Identification with Applications in Systems Biology".  ... 
dblp:conf/nips/TitsiasL11 fatcat:lwokgwgdfbf3ras7lpckgsabze
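The spike-and-slab prior underlying this entry places exact zeros on a weight with some probability (the spike) and draws the rest from a broad Gaussian (the slab), which is what induces the task-specific sparsity in the linear combination of shared functions. A minimal prior-sampling sketch (parameter names and values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_spike_slab(d, pi=0.2, slab_var=1.0):
    # pi: prior probability that a weight is "on" (drawn from the slab).
    s = rng.random(d) < pi                          # binary spike indicators
    w = rng.normal(0.0, np.sqrt(slab_var), size=d)  # slab: broad Gaussian
    return s * w                                    # exact zeros where the spike fires

w = sample_spike_slab(1000)
```

Variational inference for this prior, as in the paper, maintains a posterior over both the indicators and the slab weights rather than just sampling from the prior as above.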

All-Spin Bayesian Neural Networks [article]

Kezhou Yang, Akul Malhotra, Sen Lu, Abhronil Sengupta
2020 arXiv   pre-print
Probabilistic machine learning enabled by the Bayesian formulation has recently gained significant attention in the domain of automated reasoning and decision-making.  ...  In this paper, we propose an "All-Spin" Bayesian Neural Network where the underlying spintronic hardware provides a better match to the Bayesian computing models.  ...  Standard supervised backpropagation based learning techniques are unable to deal with such issues since they do not overtly represent uncertainty in the modelling process.  ... 
arXiv:1911.05828v4 fatcat:r6ycl6bfwrcvri2vdaefuweuqq

Patch Group Based Bayesian Learning for Blind Image Denoising [chapter]

Jun Xu, Dongwei Ren, Lei Zhang, David Zhang
2017 Lecture Notes in Computer Science  
We then employ nonparametric Bayesian dictionary learning to extract the latent clean structures from the PG variations.  ...  inferred by the variational Bayesian method.  ...  Patch Group based Bayesian Dictionary Learning (Truncated beta-Bernoulli Process for Dictionary Learning): Once we have clustered similar PG variations into different components, we can extract the latent  ... 
doi:10.1007/978-3-319-54407-6_6 fatcat:zgj4nurpizebhdhw6jxjkvriiy

Learning Multiple Related Tasks using Latent Independent Component Analysis

Jian Zhang, Zoubin Ghahramani, Yiming Yang
2005 Neural Information Processing Systems  
We propose a probabilistic model based on Independent Component Analysis for learning multiple related tasks.  ...  On the other hand, from the general Bayesian perspective [2, 6] we could treat the problem of learning multiple tasks as learning a Bayesian prior over the task space.  ...  It uses Gaussian Processes (GP) to model regression through a latent factor analysis.  ... 
dblp:conf/nips/ZhangGY05 fatcat:fumdil6azjc7lmtirwbejxb5im
Showing results 1 — 15 out of 29,441 results