
Efficient inference in matrix-variate Gaussian models with \iid observation noise

Oliver Stegle, Christoph Lippert, Joris M. Mooij, Neil D. Lawrence, Karsten M. Borgwardt
2011 Neural Information Processing Systems  
Here, we discuss an approach for efficient inference in such models that explicitly accounts for iid observation noise.  ...  Inference in matrix-variate Gaussian models has major applications for multi-output prediction and joint learning of row and column covariances from matrix-variate data.  ...  Here, we address these shortcomings and propose a general framework for efficient inference in matrix-variate normal models that include iid observation noise.  ... 
dblp:conf/nips/StegleLMLB11 fatcat:wh3f675hqfed3pqaq3ry6vdhje
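The efficiency this abstract refers to comes from the spectral structure of a Kronecker covariance plus iid noise: eigendecomposing the row and column covariances separately avoids ever forming the full nm × nm matrix. A minimal numpy sketch of that log-likelihood (my own illustration of the general trick, not the authors' code):

```python
import numpy as np

def kron_gaussian_loglik(Y, C, R, sigma2):
    """Log-density of vec(Y) ~ N(0, C (x) R + sigma2*I), where Y is n x m,
    R (n x n) is the row covariance and C (m x m) the column covariance.
    Costs O(n^3 + m^3) instead of O((n m)^3) for the naive approach."""
    sC, UC = np.linalg.eigh(C)
    sR, UR = np.linalg.eigh(R)
    # Eigenvalues of C (x) R + sigma2*I: products of the factor eigenvalues.
    lam = np.outer(sR, sC) + sigma2
    # Rotate Y into the joint eigenbasis: (UC (x) UR)^T vec(Y) = vec(UR^T Y UC).
    Yt = UR.T @ Y @ UC
    quad = np.sum(Yt**2 / lam)
    logdet = np.sum(np.log(lam))
    return -0.5 * (Y.size * np.log(2 * np.pi) + logdet + quad)
```

For even moderate n and m the naive covariance is prohibitively large, while the two small eigendecompositions suffice for both the quadratic form and the log-determinant.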

Variational Bayesian inference of hidden stochastic processes with unknown parameters [article]

Komlan Atitey, Pavel Loskot, Lyudmila Mihaylova
2019 arXiv   pre-print
In particular, a random process generated by the autoregressive moving average (ARMA) linear model is inferred from non-linear, noisy observations.  ...  This paper adopts a machine learning approach to devise variational Bayesian inference for such scenarios.  ...  Since the noise samples are IID, their joint density factorizes into a product of per-sample densities Υ(·; ·, ·) (Eq. (8)). Thus, non-linear observations in (1) represent a bivariate transformation of the Gaussian and gamma distributed hidden random variables  ... 
arXiv:1911.00757v1 fatcat:oowed2t3xrco5aks55ndibcm6i
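The IID factorization invoked in the snippet is simply that the joint density of independent samples is the product of per-sample densities, so the log-likelihood is a sum. Taking Υ to be a gamma density is an assumption on my part (the snippet mentions gamma-distributed hidden variables), purely for illustration:

```python
import numpy as np
from math import lgamma

def gamma_logpdf(x, a, b):
    """Log density of Gamma(shape=a, rate=b) at x > 0."""
    return a * np.log(b) - lgamma(a) + (a - 1) * np.log(x) - b * x

def iid_loglik(samples, a, b):
    """IID factorization: log prod_i p(x_i) = sum_i log p(x_i)."""
    return float(np.sum(gamma_logpdf(samples, a, b)))
```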

Using flexible noise models to avoid noise model misspecification in inference of differential equation time series models [article]

Richard Creswell, Ben Lambert, Chon Lok Lei, Martin Robinson, David Gavaghan
2020 arXiv   pre-print
observed variation to the signal.  ...  When modelling time series, it is common to decompose observed variation into a "signal" process, the process of interest, and "noise", representing nuisance factors that obfuscate the signal.  ...  The growth parameter, r, was most affected by incorrectly assuming IID Gaussian noise, where the IID noise model resulted in estimates with overly inflated uncertainty.  ... 
arXiv:2011.04854v1 fatcat:us5cnocq4rh6hbkyykr43ann6m

It is all in the noise: Efficient multi-task Gaussian process inference with structured residuals

Barbara Rakitsch, Christoph Lippert, Karsten M. Borgwardt, Oliver Stegle
2013 Neural Information Processing Systems  
The resulting Gaussian model has a covariance term in the form of a sum of Kronecker products, for which efficient parameter inference and out-of-sample prediction are feasible.  ...  On both synthetic examples and applications to phenotype prediction in genetics, we find substantial benefits of modeling structured noise compared to established alternatives.  ...  Efficient Inference In general, efficient inference can be carried out for Gaussian models with a sum covariance of two arbitrary Kronecker products, p(vec Y | C, R, Σ) = N(vec Y | 0, C_{T×T} ⊗ R_{N×N} + Σ_{T  ... 
dblp:conf/nips/RakitschLBS13 fatcat:4d7t2y4dd5ghnc5k6f3qc25k5y
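A sum of two Kronecker products can still be handled at cubic cost in each dimension: whiten by one Kronecker product, then eigendecompose the transformed factors of the other. A sketch of this general trick (an illustration with my own matrix names, not the paper's implementation; all four matrices are assumed positive definite):

```python
import numpy as np

def sum_kron_gaussian_loglik(Y, C, R, S, O):
    """Log-density of vec(Y) ~ N(0, C (x) R + S (x) O) for Y n x m,
    with C, S m x m and R, O n x n. Whitening by S (x) O reduces the
    covariance to C~ (x) R~ + I, which is diagonalized factor-wise."""
    sS, US = np.linalg.eigh(S)
    sO, UO = np.linalg.eigh(O)
    WS = US / np.sqrt(sS)          # WS @ WS.T = S^{-1}
    WO = UO / np.sqrt(sO)
    Ct = WS.T @ C @ WS             # whitened first Kronecker factor pair
    Rt = WO.T @ R @ WO
    sC, UC = np.linalg.eigh(Ct)
    sR, UR = np.linalg.eigh(Rt)
    lam = np.outer(sR, sC) + 1.0   # eigenvalues of C~ (x) R~ + I
    Yt = UR.T @ (WO.T @ Y @ WS) @ UC
    quad = np.sum(Yt**2 / lam)
    # log|C(x)R + S(x)O| = sum(log lam) + log|S (x) O|
    logdet = (np.sum(np.log(lam))
              + Y.shape[0] * np.sum(np.log(sS))
              + Y.shape[1] * np.sum(np.log(sO)))
    return -0.5 * (Y.size * np.log(2 * np.pi) + logdet + quad)
```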

Bayesian Out-Trees [article]

Tony S. Jebara
2012 arXiv   pre-print
The latent graph structure is assumed to lie in the family of directed out-tree graphs which leads to efficient Bayesian inference.  ...  This novel likelihood subsumes iid likelihood, is exchangeable and yields efficient unsupervised and semi-supervised learning algorithms.  ...  In the last example, the matrix Σ_cc was reduced to sample a spiral with less noise.  ... 
arXiv:1206.3269v1 fatcat:gdqlxmtn3zcu3phnytbzxwtlge

Black box variational inference for state space models [article]

Evan Archer, Il Memming Park, Lars Buesing, John Cunningham, Liam Paninski
2015 arXiv   pre-print
A few highly-structured models, such as the linear dynamical system with linear-Gaussian observations, have closed-form inference procedures (e.g. the Kalman Filter), but this case is an exception to the  ...  Here, we extend recent developments in stochastic variational inference to develop a 'black-box' approximate inference technique for latent variable models with latent dynamical structure.  ...  Observations are coupled to the latents through a loading matrix C, x_t = C z_t + η_t (Eq. 23), and η_t is Gaussian noise with diagonal covariance.  ... 
arXiv:1511.07367v1 fatcat:c2jtepr5r5aufo6327cr5d2m5a
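The closed-form inference this abstract cites for the linear-Gaussian case is the Kalman filter. A minimal sketch for the model z_t = A z_{t-1} + w_t, x_t = C z_t + η_t with Gaussian w_t ~ N(0, Q) and η_t ~ N(0, Rn) (the transition matrix A and the noise covariances are my own generic notation, not the paper's):

```python
import numpy as np

def kalman_filter(xs, A, C, Q, Rn, mu0, P0):
    """Exact filtered means/covariances of the latent state z_t given
    observations xs under the linear-Gaussian state space model."""
    mu, P = mu0, P0
    means, covs = [], []
    for x in xs:
        # Predict step: propagate the state estimate through the dynamics.
        mu = A @ mu
        P = A @ P @ A.T + Q
        # Update step: correct with the new observation via the Kalman gain.
        Sg = C @ P @ C.T + Rn
        K = P @ C.T @ np.linalg.inv(Sg)
        mu = mu + K @ (x - C @ mu)
        P = P - K @ C @ P
        means.append(mu)
        covs.append(P)
    return np.array(means), np.array(covs)
```

Running it on a stream of constant scalar observations shows the filtered mean converging to the observed value while the filtered variance shrinks to its steady state.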

Continuous-time modeling of random searches: statistical properties and inference

Reiichiro Kawai
2012 Journal of Physics A: Mathematical and Theoretical  
Among the proposed models, Brownian motion is the most tractable in various ways, while its Gaussianity and the infinite variation of its sample paths do not describe reality well.  ...  To address this issue, in this paper, we propose to model the continuous-time search paths directly with a continuous-time stochastic process for which the observer makes statistical inference based on  ...  Moments of Increments In the case of Brownian motion, the increments are iid Gaussian with N(∆γ, ∆Σ), where Σ is the variance-covariance matrix defined by (2.4).  ... 
doi:10.1088/1751-8113/45/23/235004 fatcat:n3dayckkuvdshaoqoyyqrk7pju
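Because Brownian increments on an equidistant grid are iid N(∆γ, ∆Σ), drift and diffusion have simple closed-form maximum-likelihood estimators from the observed increments. A sketch under that Gaussian assumption (illustrative only; the paper's heavier-tailed models need different estimators):

```python
import numpy as np

def bm_mle(path, dt):
    """MLE of drift gamma and diffusion Sigma for a d-dimensional Brownian
    motion observed at spacing dt: increments are iid N(dt*gamma, dt*Sigma).
    path has shape (K+1, d)."""
    inc = np.diff(path, axis=0)                 # (K, d) increments
    gamma_hat = inc.mean(axis=0) / dt           # sample mean rescaled by dt
    resid = inc - dt * gamma_hat
    Sigma_hat = resid.T @ resid / (len(inc) * dt)
    return gamma_hat, Sigma_hat
```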

Gaussian Processes for Independence Tests with Non-iid Data in Causal Inference

Seth R. Flaxman, Daniel B. Neill, Alexander J. Smola
2015 ACM Transactions on Intelligent Systems and Technology  
We highlight the important issue of non-iid observations: when data are observed in space, time, or on a network, "nearby" observations are likely to be similar.  ...  In the case of real-valued iid data, linear dependencies, and Gaussian error terms, partial correlation is sufficient.  ...  Fig. 3. Draws from a Gaussian Process posterior with Gaussian RBF kernel after observations at {(−1, 1), (0, 0), (1, 1)}. Left: Noise-free observations. Right: Noisy observations with σ² = 0.2.  ... 
doi:10.1145/2806892 fatcat:ohpwu5zmz5fibbbzi7inwhpfge
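The posterior draws described in Fig. 3 come from standard GP regression with an RBF kernel, where σ² = 0 gives noise-free interpolation and σ² > 0 noisy observations. A minimal sketch of the posterior computation (illustrative; the unit length-scale is my assumption, not the paper's setting):

```python
import numpy as np

def rbf(X1, X2, ls=1.0):
    """Gaussian RBF kernel matrix between two sets of 1-D inputs."""
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(-0.5 * d2 / ls**2)

def gp_posterior(Xtrain, y, Xtest, sigma2=0.0, ls=1.0):
    """GP posterior mean and covariance at Xtest; sigma2=0 interpolates
    the training points exactly, sigma2>0 models noisy observations."""
    K = rbf(Xtrain, Xtrain, ls) + sigma2 * np.eye(len(Xtrain))
    Ks = rbf(Xtest, Xtrain, ls)
    Kss = rbf(Xtest, Xtest, ls)
    mean = Ks @ np.linalg.solve(K, y)
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, cov
```

With the caption's training set {(−1, 1), (0, 0), (1, 1)}, the noise-free posterior passes through the data with zero variance there, while σ² = 0.2 leaves residual uncertainty at the observed inputs.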

A Variational Bayesian Inference-Inspired Unrolled Deep Network for MIMO Detection [article]

Qian Wan, Jun Fang, Yinsen Huang, Huiping Duan, Hongbin Li
2021 arXiv   pre-print
To address these issues, in this paper, we develop a model-driven DL detector based on variational Bayesian inference.  ...  a significant performance improvement over the OAMPNet and MMNet in the presence of noise variance uncertainty.  ...  Our proposed networks, referred to as the variational Bayesian inference-inspired network (VBINet), have very few learnable parameters and thus can be efficiently trained with a moderate number of training  ... 
arXiv:2109.12275v1 fatcat:pbslhqb6frgdnpuz3jeoenrzy4

Active Probabilistic Inference on Matrices for Pre-Conditioning in Stochastic Optimization [article]

Filip de Roos, Philipp Hennig
2019 arXiv   pre-print
with strong Gaussian noise.  ...  We propose an iterative algorithm inspired by classic iterative linear solvers that uses a probabilistic model to actively infer a pre-conditioner in situations where Hessian-projections can only be constructed  ...  build a probabilistic Gaussian inference model for matrices from noisy matrix-vector products (Section 2.1) by extending existing work on matrix-variate Gaussian inference.  ... 
arXiv:1902.07557v1 fatcat:fapfj4vvlvayro2ubxnffkzs2m

Matrix Normal PCA for Interpretable Dimension Reduction and Graphical Noise Modeling [article]

Chihao Zhang, Kuo Gai, Shihua Zhang
2021 arXiv   pre-print
To address this challenge, some variants of PCA for data with non-IID noise have been proposed.  ...  To this end, we propose a powerful and intuitive PCA method (MN-PCA) through modeling the graphical noise by the matrix normal distribution, which enables us to explore the structure of noise in both the  ...  First, it can be extended to matrix variate data, where one has multiple matrix variate observations.  ... 
arXiv:1911.10796v2 fatcat:5adaz5gjqnhxjcaunc3r62jnim
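The matrix normal distribution underlying MN-PCA is equivalent to a Gaussian on vec(X) with Kronecker covariance, and can be sampled with two Cholesky factors. A sketch of the distribution itself (my own illustration; not the MN-PCA algorithm):

```python
import numpy as np

def sample_matrix_normal(M, U, V, rng):
    """Draw X ~ MN(M, U, V): rows covary via U (n x n), columns via V (m x m).
    Equivalent to vec(X) ~ N(vec(M), V (x) U) under column-stacking vec."""
    A = np.linalg.cholesky(U)
    B = np.linalg.cholesky(V)
    Z = rng.standard_normal(M.shape)   # iid standard normal core
    return M + A @ Z @ B.T             # cov(vec(X)) = V (x) U
```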

Improving Real-Time Position Estimation Using Correlated Noise Models

Andrew Martin, Matthew Parry, Andy W. R. Soundy, Bradley J. Panckhurst, Phillip Brown, Timothy C. A. Molteno, Daniel Schumayer
2020 Sensors  
The noise model is capable of capturing the observed autocorrelated process noise in the altitude, latitude and longitude recordings.  ...  This model outperforms a KF that assumes a Gaussian noise model, which under-reports the position uncertainties.  ...  distributed (iid) Gaussian noise model [8] .  ... 
doi:10.3390/s20205913 pmid:33092018 fatcat:y2vcr76mhfevbixqetcgrhgkoe

Large-scale collaborative prediction using a nonparametric random effects model

Kai Yu, John Lafferty, Shenghuo Zhu, Yihong Gong
2009 Proceedings of the 26th Annual International Conference on Machine Learning - ICML '09  
The model is nonparametric in the sense that the dimensionality of random effects is not specified a priori but is instead determined from data.  ...  An approach to estimating the model is presented that uses an EM algorithm efficient on a very large-scale collaborative prediction problem.  ...  & Mnih, 2008a). • PMF-VB: probabilistic matrix factorization using a variational Bayes method for inference (Lim & Teh, 2007).  ... 
doi:10.1145/1553374.1553525 dblp:conf/icml/YuLZG09 fatcat:kw5twqw5urc5zmvzirjkyg3mdq

Spectral Methods for Indian Buffet Process Inference

Hsiao-Yu Fish Tung, Alexander J. Smola
2014 Neural Information Processing Systems  
We give a computationally efficient iterative inference algorithm, concentration of measure bounds, and reconstruction guarantees.  ...  We provide an efficient spectral algorithm as an alternative to costly Variational Bayes and sampling-based algorithms.  ...  In particular, [24] make a number of alternative assumptions on p(y), namely either that it is iid Gaussian or that it is iid Laplacian.  ... 
dblp:conf/nips/TungS14 fatcat:tjvlgxktc5bmhpzct3wdw4zjsy

Communicate to Learn at the Edge [article]

Deniz Gunduz, David Burth Kurka, Mikolaj Jankowski, Mohammad Mohammadi Amiri, Emre Ozfatura, Sreejith Sreekumar
2020 arXiv   pre-print
Moreover, edge devices are connected through bandwidth- and power-limited wireless links that suffer from noise, time-variations, and interference.  ...  Information and coding theory have laid the foundations of reliable and efficient communications in the presence of channel imperfections, whose application in modern wireless networks has been a tremendous  ...  In [13], sparsification of model updates is proposed, followed by linear projection with a pseudo-random Gaussian matrix.  ... 
arXiv:2009.13269v1 fatcat:t6dcbiwzffbrveyyfzyzstb5ea