Parameterizing Region Covariance: An Efficient Way To Apply Sparse Codes On Second Order Statistics
[article]
2016
arXiv
pre-print
We present an approach to transform a structured sparse model learning problem to a traditional vectorized sparse modeling problem by constructing a Euclidean space representation for region covariance ...
Currently there is a trend to learn sparse models directly on structured data, such as region covariance. However, such methods, when combined with region covariance, often require complex computation. ...
We present an approach for sparse coding parameterized representations of region covariance matrices inspired by finance applications. ...
arXiv:1602.02822v1
fatcat:iq5rri56fzfrrnsypaftppfndq
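A common way to obtain a Euclidean-space representation of an SPD region covariance matrix is the log-Euclidean vectorization; the sketch below illustrates that general idea and is not necessarily the parameterization proposed in this paper (the function name and the sqrt(2) scaling convention are illustrative assumptions).

```python
import numpy as np
from scipy.linalg import logm

def log_euclidean_vector(C):
    """Map an SPD covariance matrix C to a Euclidean vector via its matrix logarithm.

    Off-diagonal entries are scaled by sqrt(2) so that the Euclidean norm of the
    vector equals the Frobenius norm of logm(C); standard vectorized sparse coding
    can then be applied to the result.
    """
    L = logm(C)                            # symmetric for SPD input
    iu = np.triu_indices_from(L, k=1)      # strict upper triangle
    return np.concatenate([np.diag(L), np.sqrt(2) * L[iu]])
```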
On the role of ML estimation and Bregman divergences in sparse representation of covariance and precision matrices
[article]
2018
arXiv
pre-print
Therefore, the major design challenge is to introduce adequate problem formulations and offer solutions that will efficiently lead to desired representations. ...
In this context, sparse representation of covariance and precision matrices, which appear as feature descriptors or mixture model parameters, respectively, will be in the main focus of this paper. ...
On the other hand, when used in the role of a feature descriptor, each covariance represents a set of second order statistics that are collected over some local (e.g. spatial or temporal) window or region ...
arXiv:1810.11718v1
fatcat:de4s4g4bxvh7liqsoilmydjkby
Covariances, Robustness, and Variational Bayes
[article]
2018
arXiv
pre-print
Mean-field Variational Bayes (MFVB) is an approximate Bayesian posterior inference technique that is increasingly popular due to its fast runtimes on large-scale datasets. ...
The estimates for MFVB posterior covariances rely on a result from the classical Bayesian robustness literature relating derivatives of posterior expectations to posterior covariances and include the Laplace ...
Consequently, we recommend fitting models using second-order Newton trust region methods. ...
arXiv:1709.02536v3
fatcat:maff66lvyfg3fnttszi6b5wyeu
Model-based Clustering with Sparse Covariance Matrices
[article]
2018
arXiv
pre-print
To this end, we introduce mixtures of Gaussian covariance graph models for model-based clustering with sparse covariance matrices. ...
Model estimation is carried out using a structural-EM algorithm for parameters and graph structure estimation, where two alternative strategies based on a genetic algorithm and an efficient stepwise search ...
On the other hand, the code implementing our proposed method is written in pure R (which is known to be slower than compiled languages) and, although much care and effort have been put into an efficient ...
arXiv:1711.07748v2
fatcat:pjsv52gq4nes7iyy6czyauq23y
Covariance Estimation: The GLM and Regularization Perspectives
2011
Statistical Science
Finding an unconstrained and statistically interpretable reparameterization of a covariance matrix is still an open problem in statistics. ...
It reduces the unintuitive task of covariance estimation to that of modeling a sequence of regressions at the cost of imposing an a priori order among the variables. ...
and statistically interpretable reparameterization exists for a general covariance matrix without imposing an order on the variables. ...
doi:10.1214/11-sts358
fatcat:hwtcgdjd4jgt5lafc4xnebc5pe
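The regression-based reparameterization referred to in this entry is the modified Cholesky decomposition: given an a priori ordering of the variables, each variable is regressed on its predecessors, yielding a unit lower-triangular matrix T and a diagonal matrix D of innovation variances with T Σ Tᵀ = D. A minimal sketch, assuming the covariance matrix is known (function and variable names are illustrative):

```python
import numpy as np

def modified_cholesky(sigma):
    """Return (T, d) with T @ sigma @ T.T = diag(d), where T is unit lower-triangular.

    Row j of T holds the negated coefficients from regressing variable j on
    variables 0..j-1, and d[j] is the corresponding innovation variance.
    """
    p = sigma.shape[0]
    T = np.eye(p)
    d = np.empty(p)
    d[0] = sigma[0, 0]
    for j in range(1, p):
        phi = np.linalg.solve(sigma[:j, :j], sigma[:j, j])   # regression coefficients
        T[j, :j] = -phi
        d[j] = sigma[j, j] - sigma[:j, j] @ phi               # innovation variance
    return T, d
```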
Model-based clustering with sparse covariance matrices
2018
Statistics and computing
To this end, we introduce mixtures of Gaussian covariance graph models for model-based clustering with sparse covariance matrices. ...
Model estimation is carried out using a structural-EM algorithm for parameters and graph structure estimation, where two alternative strategies based on a genetic algorithm and an efficient stepwise search ...
On the other hand, the code implementing our proposed method is written in pure R (which is known to be slower than compiled languages) and, although much care and effort have been put into an efficient ...
doi:10.1007/s11222-018-9838-y
fatcat:sje3mzj3b5azloftl7djfoujje
Background error covariance estimation for atmospheric CO2 data assimilation
2013
Journal of Geophysical Research - Atmospheres
The resulting error statistics: (1) vary regionally and seasonally to better capture the uncertainty in the background CO2 field, and (2) have a positive impact on the analysis estimates by allowing observations ...
characterizing background error statistics on atmospheric CO2 concentration estimates. ...
in order to make efficient use of the observational information. ...
doi:10.1002/jgrd.50654
fatcat:j5r2rpo2nbhu3lynwe7d7v4rf4
Spatial modelling using a new class of nonstationary covariance functions
2006
Environmetrics
More complicated models may require simpler parameterizations for computational efficiency. ...
The class allows one to knit together local covariance parameters into a valid global nonstationary covariance, regardless of how the local covariance structure is estimated. ...
The fourth method (gam) also fits the spatial surface using a thin plate spline and GCV, but in a computationally efficient way (Wood 2003), coded in the gam() function in the mgcv library in R. ...
doi:10.1002/env.785
pmid:18163157
pmcid:PMC2157553
fatcat:fqpz2555yvgvzhci27fbz4pt3y
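The construction described here, knitting spatially varying local covariance parameters into a valid global nonstationary covariance, has the following general kernel-convolution form; this is a sketch under stated assumptions (the local kernel matrices Sigma1/Sigma2 and the correlation function rho are placeholders, and the paper's class is more general):

```python
import numpy as np

def nonstationary_cov(x1, x2, Sigma1, Sigma2, rho=lambda d: np.exp(-d)):
    """Nonstationary covariance between locations x1 and x2 built from the
    spatially varying local kernel matrices Sigma1 and Sigma2.

    The quadratic-form distance uses the average of the two local kernels,
    and the determinant prefactor keeps the resulting covariance valid.
    """
    avg = (Sigma1 + Sigma2) / 2.0
    diff = np.asarray(x1, float) - np.asarray(x2, float)
    Q = diff @ np.linalg.solve(avg, diff)                     # Mahalanobis-like distance
    prefactor = (np.linalg.det(Sigma1) ** 0.25 *
                 np.linalg.det(Sigma2) ** 0.25 /
                 np.sqrt(np.linalg.det(avg)))
    return prefactor * rho(np.sqrt(Q))
```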
Simulations and cosmological inference: A statistical model for power spectra means and covariances
2008
Physical Review D
The model is calibrated using an extension of the framework in Habib et al. (2007) to train Gaussian processes for the power spectrum mean and covariance given a set of simulation runs over a hypercube ...
We describe an approximate statistical model for the sample variance distribution of the non-linear matter power spectrum that can be calibrated from limited numbers of simulations. ...
The computations for this paper were carried out using the Scythe Statistical Library [39]. ...
doi:10.1103/physrevd.78.063529
fatcat:4kbhe6f3ffeeta2lmlbpwla37y
Covariance Trees for 2D and 3D Processing
2014
2014 IEEE Conference on Computer Vision and Pattern Recognition
We start by explaining how to construct a Covariance Tree from a subset of the input data, how to enrich its statistics from a larger set in a streaming process, and how to query it efficiently, at any ...
Gaussian Mixture Models have become one of the major tools in modern statistical image processing, and allowed performance breakthroughs in patch-based image denoising and restoration problems. ...
However, progress in this area has been slow because we lack the required tools for efficiently handling the massive amounts of data that need to be fed to learn these models in order to approach optimal ...
doi:10.1109/cvpr.2014.78
dblp:conf/cvpr/GuillemotAB14
fatcat:v7eq3yr5dvbb7nqdd7niao3ogm
Distributed Algorithms for Computing Very Large Thresholded Covariance Matrices
2016
ACM Transactions on Knowledge Discovery from Data
This suggests an obvious way to address the computational difficulty as well: first, compute the identities of the K entries in the covariance matrix that are actually important in the sense that they ...
than the number of covariances to compute. ...
The second problem is computational. An entry $\hat{s}_{j,k}$ in the empirical covariance matrix $\hat{S}$ is computed as $\hat{s}_{j,k} = \frac{1}{n}\sum_i x_{i,j}\,x_{i,k} - \frac{1}{n^2}\sum_i x_{i,j}\sum_i x_{i,k}$. ...
doi:10.1145/2935750
fatcat:l7hmam4m2fcrheaz7odf5utoqi
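As a minimal sketch of the empirical covariance entry formula quoted in the entry above (variable and function names are illustrative, not from the paper):

```python
import numpy as np

def empirical_covariance_entry(X, j, k):
    """Entry (j, k) of the empirical covariance of X (n samples by p features),
    using the 1/n normalization from the formula above."""
    n = X.shape[0]
    return (X[:, j] @ X[:, k]) / n - X[:, j].sum() * X[:, k].sum() / n**2

# Agrees with NumPy's biased covariance estimate:
X = np.random.default_rng(0).normal(size=(200, 5))
assert np.isclose(empirical_covariance_entry(X, 1, 3),
                  np.cov(X, rowvar=False, bias=True)[1, 3])
```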
Shape Matters: Understanding the Implicit Bias of the Noise Covariance
[article]
2020
arXiv
pre-print
We show that in an over-parameterized setting, SGD with label noise recovers the sparse ground-truth with an arbitrary initialization, whereas SGD with Gaussian noise or gradient descent overfits to dense ...
This paper theoretically characterizes this phenomenon on a quadratically-parameterized model introduced by Vaskevicius et al. and Woodworth et al. ...
CW acknowledges support from an NSF Graduate Research Fellowship. JDL acknowledges support of the ARO under MURI Award W911NF-11-1-0303, the Sloan Research Fellowship, and NSF CCF 2002272. ...
arXiv:2006.08680v2
fatcat:g2jg27ryybbpzi2kxh5p7cy3ny
Reverberant signal separation using optimized complex sparse nonnegative tensor deconvolution on spectral covariance matrix
2018
Digital signal processing (Print)
Unlike conventional methods, the proposed model decomposition is performed directly on the statistics in the form of spectral covariance matrix instead of the data itself (i.e. the mixed signal). ...
In addition, the proposed Gamma-Exponential process can be used to initialize the parameterization of the CNTF2D. ...
complex matrix factorization on the spectral covariance matrices where the latter has to be constructed by computing the first and second order statistics of the data spectrogram as shown in Section 3.2 ...
doi:10.1016/j.dsp.2018.07.018
fatcat:x5lvxsgfsnhbxobvd4rhyzye64
Bayesian Functional Generalized Additive Models with Sparsely Observed Covariates
[article]
2017
arXiv
pre-print
Our approach allows the functional covariates to be sparsely observed and measured with error, whereas the estimation procedure of McLean et al. (2013) required that they be noiselessly observed on a regular ...
In a real data analysis, our methods are applied to forecasting closing price for items up for auction on the online auction website eBay. ...
on the sparse trajectories. ...
arXiv:1305.3585v2
fatcat:v22eh56g5zdejmjuakezyrseeu
Impact of Removing Covariance Localization in an Ensemble Kalman Filter: Experiments with 10 240 Members Using an Intermediate AGCM
2016
Monthly Weather Review
This study modified the EnKF code to save memory and enabled, for the first time, the complete removal of covariance localization with an intermediate AGCM. ...
The authors' previous work pioneered implementation of an EnKF with a large ensemble of up to 10 240 members, but this method required application of a relatively broad covariance localization to avoid ...
The SPEEDY-LETKF code is publicly available online (http://code.google.com/p/miyoshi/). We are grateful to the members of the Data Assimilation Research Team, RIKEN AICS, for fruitful discussions. ...
doi:10.1175/mwr-d-15-0388.1
fatcat:l34ddszconegbcanwwawhajghy
Showing results 1 — 15 out of 5,660 results