137,640 Hits in 5.0 sec

Variational Inference based on Robust Divergences [article]

Futoshi Futami, Issei Sato, Masashi Sugiyama
2018 arXiv   pre-print
In this paper, based on Zellner's optimization and variational formulation of Bayesian inference, we propose an outlier-robust pseudo-Bayesian variational method by replacing the Kullback-Leibler divergence  ...  used for data fitting with a robust divergence such as the beta- and gamma-divergences.  ...  Robust and efficient estimation by minimising a density power divergence. Biometrika, 85(3):549-559, 1998.  ... 
arXiv:1710.06595v2 fatcat:fbuypjea5rg65bkuqjbkm2gnx4
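The snippet above describes swapping the KL data-fit term for the beta- (density power) divergence of Basu et al. (1998), whose key property is that outliers are down-weighted by a power of the model density. A minimal, self-contained sketch of that idea for robust Gaussian mean estimation (the helper name and the grid-search optimizer are illustrative, not the paper's algorithm):

```python
import numpy as np

def beta_loss(mu, x, beta, sigma=1.0):
    """Empirical density-power (beta) divergence loss for a N(mu, sigma^2) model.

    Outliers enter only through f(x)**beta, which vanishes far from mu,
    so they barely influence the estimate (unlike the log-likelihood).
    """
    f = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    data_term = -np.mean(f ** beta) / beta
    # The integral of f^(1+beta) has a closed form for a Gaussian model.
    integral = (1 + beta) ** -0.5 * (2 * np.pi * sigma ** 2) ** (-beta / 2)
    return data_term + integral / (1 + beta)

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(10.0, 0.1, 5)])  # 5% outliers

mle = x.mean()  # maximum likelihood: dragged toward the outlier cluster
grid = np.linspace(-3.0, 3.0, 601)
robust = grid[np.argmin([beta_loss(m, x, beta=0.5) for m in grid])]
```

With `beta = 0.5` the estimate stays near the inlier mean, while the sample mean is pulled roughly half a unit toward the contamination; as `beta -> 0` the loss recovers the usual negative log-likelihood.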

Collaborative filtering recommendation algorithm based on variational inference

Kai Zheng, Xianjun Yang, Yilei Wang, Yingjie Wu, Xianghan Zheng
2020 International Journal of Crowd Science  
Based on the aforementioned analysis, this paper uses a variational auto-encoder to construct a generating network, which can restore user-rating data to solve the problems of poor robustness and over-fitting  ...  Because variational inference learns the probability distribution of the hidden vector, the problems of poor robustness and overfitting are alleviated.  ...  Collaborative filtering recommendation algorithm based on variational inference Most existing collaborative filtering algorithms have poor robustness and overfitting problems with the expansion of the  ... 
doi:10.1108/ijcs-10-2019-0030 fatcat:ulpjrrifbjhdhktuq2nifra4sq

Guess First to Enable Better Compression and Adversarial Robustness [article]

Sicheng Zhu, Bang An, Shiyu Niu
2020 arXiv   pre-print
In this paper, we try to leverage one of the mechanisms in human recognition and propose a bio-inspired classification framework in which model inference is conditioned on label hypothesis.  ...  Machine learning models are generally vulnerable to adversarial examples, which is in contrast to the robustness of humans.  ...  tasks via f-divergence variational estimation.  ... 
arXiv:2001.03311v1 fatcat:aiasa4uv3befbi2fejl7x3gb3e

Unified Robust Semi-Supervised Variational Autoencoder

Xu Chen
2021 International Conference on Machine Learning  
Moreover, a robust divergence measure is employed to further enhance the robustness, where a novel variational lower bound is derived and optimized to infer the network parameters.  ...  Typically, the uncertainty of input data is characterized by placing an uncertainty prior on the parameters of probability density distributions in order to ensure the robustness of the variational  ...  Variational Lower Bound on β-Divergence for URSVAE In order to infer the network parameters for robust optimization, we shall now derive the variational lower bound based on β-divergence (β-ELBO) for our  ... 
dblp:conf/icml/Chen21 fatcat:unwl62leffgutftkpan7zmzone

Shedding Light on the Grey Zone of Speciation along a Continuum of Genomic Divergence

Camille Roux, Christelle Fraïsse, Jonathan Romiguier, Yoann Anciaux, Nicolas Galtier, Nicolas Bierne, Craig Moritz
2016 PLoS Biology  
Shedding Light on the Grey Zone of Speciation. PLOS Biology |  ...  French National Research Agency (ANR) en/project-based-funding-to-advance-frenchresearch/ (grant number ANR-12-BSV7-0011), HYSEA project.  ...  The robustness of the inference (i.e., the probability of correctly supporting model M if true) obviously depends on X.  ... 
doi:10.1371/journal.pbio.2000234 pmid:28027292 pmcid:PMC5189939 fatcat:5s6sl4szr5getj6zyl62j7gfsq

Doubly Robust Bayesian Inference for Non-Stationary Streaming Data with β-Divergences

Jeremias Knoblauch, Jack Jewson, Theodoros Damoulas
2018 Neural Information Processing Systems  
We present the first robust Bayesian Online Changepoint Detection algorithm through General Bayesian Inference (GBI) with β-divergences.  ...  Secondly, we give a principled way of choosing the divergence parameter β by minimizing expected predictive loss online.  ...  TD is funded by the Lloyd's Register Foundation programme on Data Centric Engineering through the London Air Quality project.  ... 
dblp:conf/nips/KnoblauchJD18 fatcat:judybt5duzgkxipwlk3zhqpqtu

Robust Variational Autoencoder [article]

Haleh Akrami, Anand A. Joshi, Jian Li, Sergul Aydore, Richard M. Leahy
2019 arXiv   pre-print
Our robust VAE is based on beta-divergence rather than the standard Kullback-Leibler (KL) divergence.  ...  We demonstrate the performance of our β-divergence based autoencoder for a range of image datasets, showing improved robustness to outliers both qualitatively and quantitatively.  ...  Our Contributions We propose a novel robust VAE (RVAE) using robust variational inference [11] that uses a β-ELBO based cost function.  ... 
arXiv:1905.09961v2 fatcat:wsiny2pjqndpva6zpxtvhexbte

Alpha-Beta Divergence For Variational Inference [article]

Jean-Baptiste Regli, Ricardo Silva
2018 arXiv   pre-print
It also gives access to objective functions never exploited before in the context of variational inference.  ...  and robustness to outliers in the data.  ...  Recently, they have been used to develop a robust pseudo variational inference method (Futami et al., 2017) .  ... 
arXiv:1805.01045v2 fatcat:ss7gw6fsxjfw3ebom6okysv6a4

Wasserstein Variational Inference [article]

Luca Ambrogioni, Umut Güçlü, Yağmur Güçlütürk, Max Hinne, Eric Maris, Marcel A. J. van Gerven
2018 arXiv   pre-print
This paper introduces Wasserstein variational inference, a new form of approximate Bayesian inference based on optimal transport theory.  ...  Wasserstein variational inference uses a new family of divergences that includes both f-divergences and the Wasserstein distance as special cases.  ...  Connections with related methods In the previous sections we showed that variational inference based on f-divergences is a special case of Wasserstein variational inference.  ... 
arXiv:1805.11284v2 fatcat:jqu4g4ldhvchfi2buqhscjnaly
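The f-divergences in the snippet above are undefined or uninformative when two distributions have disjoint support, which is one motivation for optimal-transport objectives. In one dimension the Wasserstein-1 distance between two equal-size samples has a simple closed form via the quantile (sorted-sample) coupling; a small numpy sketch for illustration only, not the paper's estimator:

```python
import numpy as np

def wasserstein_1d(a, b):
    # In 1-D, the optimal transport plan between two equal-size empirical
    # distributions matches i-th smallest to i-th smallest (quantile coupling).
    return np.mean(np.abs(np.sort(a) - np.sort(b)))

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 10_000)
y = rng.normal(2.0, 1.0, 10_000)
d = wasserstein_1d(x, y)
```

For two Gaussians with equal variance, W1 equals the mean shift (here 2.0), which the empirical estimate recovers to within sampling error.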

Generalized Variational Inference: Three arguments for deriving new Posteriors [article]

Jeremias Knoblauch, Jack Jewson, Theodoros Damoulas
2019 arXiv   pre-print
In contrast, approximations based on alternative ELBO-like objectives violate the axioms. Finally, we study a special case of the RoT that we call Generalized Variational Inference (GVI).  ...  GVI posteriors are a large and tractable family of belief distributions specified by three arguments: A loss, a divergence and a variational family.  ...  TD acknowledges funding from EPSRC grant EP/T004134/1, the Lloyd's Register Foundation programme on Data Centric Engineering, and the London Air Quality project at the Alan Turing Institute for Data Science  ... 
arXiv:1904.02063v4 fatcat:557jlks43vg55mkkvzhgg3mmz4

Dual Adversarial Variational Embedding for Robust Recommendation [article]

Qiaomin Yi, Ning Yang, Philip S. Yu
2021 arXiv   pre-print
One is based on noise injection, and the other adopts the generative Variational Auto-encoder (VAE) model. However, the existing works still face two challenges.  ...  The extensive experiments conducted on real datasets verify the effectiveness of DAVE for robust recommendation.  ...  Inspired by the success of the Variational Auto-encoder (VAE) in computer vision, one line of the existing works on robust recommendation captures user preference by latent embeddings generated from VAE based  ... 
arXiv:2106.15779v1 fatcat:3776g6w36rfjnnsmhpu3dgfuha

Shedding light on the grey zone of speciation along a continuum of genomic divergence [article]

Camille Roux, Christelle Fraisse, Jonathan Romiguier, Yoann Anciaux, Nicolas Galtier, Nicolas Bierne
2016 bioRxiv   pre-print
The speciation genomic literature, however, is mainly a collection of case studies, each with its own approach and specificities, such that a global view of the gradual process of evolution from one to  ...  Thanks to appropriate modeling of among-loci variation in genetic drift and introgression rate, we clarify the status of the majority of ambiguous cases and uncover a number of cryptic species.  ...  The robustness of the inference, i.e., the probability of correctly supporting model M if true, obviously depends on X.  ... 
doi:10.1101/059790 fatcat:px6cxvfh6zdmpcyerdzeg42wui

Variational Kalman Filtering with H∞-Based Correction for Robust Bayesian Learning in High Dimensions [article]

Niladri Das, Jed A. Duersch, Thomas A. Catanach
2022 arXiv   pre-print
The VIF approach, based on mean-field Gaussian variational inference, reduces this burden through a variational approximation to the covariance, usually in the form of a diagonal covariance approximation  ...  In this paper, we address the problem of convergence of the sequential variational inference filter (VIF) through the application of a robust variational objective and an H∞-norm based correction for a linear  ...  We assume a prior on θ ∼ N(0, Σ_θ0). Under the model (II), the optimal update scheme based on Bayesian inference is to use the Kalman Filter.  ... 
arXiv:2204.13089v1 fatcat:ttosagcdyff4nnuz4eq56ilzee

A Deep Variational Convolutional Neural Network for Robust Speech Recognition in the Waveform Domain [article]

Dino Oglic and Zoran Cvetkovic and Peter Sollich
2020 arXiv   pre-print
We rely on a probabilistic parametrization of the proposed architecture and learn the model using stochastic variational inference.  ...  This requires evaluation of an analytically intractable integral defining the Kullback-Leibler divergence term responsible for regularization, for which we propose an effective approximation based on the  ...  Recently, an approach for modulation filter-learning based on an encoder-decoder architecture and variational inference has been considered in [3] and [4].  ... 
arXiv:1906.09526v3 fatcat:36xb63fxwrcs5l53enaezvlwqy
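For context on the regularization term mentioned in the snippet: with a Gaussian prior and a mean-field Gaussian posterior, the KL term is available in closed form; the paper needs an approximation precisely because its setting departs from this standard case. The familiar tractable case looks like this (a sketch of the standard VAE regularizer, not the paper's approximation):

```python
import numpy as np

def kl_to_standard_normal(mu, logvar):
    # KL( N(mu, diag(exp(logvar))) || N(0, I) ) in closed form --
    # the analytically tractable KL regularizer of a Gaussian-prior VAE.
    return 0.5 * np.sum(np.exp(logvar) + mu ** 2 - 1.0 - logvar)

kl_zero = kl_to_standard_normal(np.zeros(4), np.zeros(4))  # identical -> 0.0
```

When prior or posterior is non-Gaussian, this integral no longer reduces to an elementary expression, hence the need for approximations such as the one the paper proposes.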

Divergence measures for statistical data processing—An annotated bibliography

Michèle Basseville
2013 Signal Processing  
This note provides a bibliography of investigations based on or related to divergence measures for theoretical and applied inference problems.  ...  Distance measures for statistical data processing. Abstract: This note contains a bibliography of works concerning the use of divergences in problems related to inference  ...  How to handle divergences between more than two distributions is addressed in Section 5. Section 6 concentrates on statistical inference based on entropy and divergence criteria.  ... 
doi:10.1016/j.sigpro.2012.09.003 fatcat:i5ki4ziujvf7hawvj663cqqzcu
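Two divergences recurring throughout these results can be compared directly on discrete distributions: the density power (beta) divergence recovers KL(p‖q) in the limit beta → 0. A quick numerical check of that limit (function names are illustrative):

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence between discrete distributions.
    return np.sum(p * np.log(p / q))

def dpd(p, q, beta):
    # Density power (beta) divergence between discrete distributions;
    # the limit beta -> 0 recovers KL(p || q).
    return np.sum(q ** (1 + beta)
                  - (1 + 1 / beta) * p * q ** beta
                  + (1 / beta) * p ** (1 + beta))

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.3, 0.4, 0.3])
gap = abs(kl(p, q) - dpd(p, q, beta=1e-4))  # shrinks as beta decreases
```

The gap is of order beta, so for `beta = 1e-4` the two values agree to several decimal places.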
Showing results 1 — 15 out of 137,640 results