The unified dominance results are applied to improving on an empirical Bayes estimator of a high-dimensional covariance matrix. ... The problem of estimating a normal covariance matrix is considered from a decision-theoretic point of view, where the dimension of the covariance matrix is larger than the sample size. ... Acknowledgments The work is supported by Grant-in-Aid for Scientific Research (15K00055), Japan. ...arXiv:1506.00748v1 fatcat:gwvrqgqik5bqjjdjav2a56valu
We establish that the estimator is both high-dimensionally consistent and minimax optimal in the symmetrized Stein loss. ... We study the problem of high-dimensional covariance estimation under the constraint that the partial correlations are nonnegative. ... As a loss, L ssym treats the dual problems of estimating the covariance matrix and the precision matrix equally. ...arXiv:2007.15252v1 fatcat:aujs7m25jvcetjpumfm4yzy6ke
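The symmetrized Stein loss mentioned in this entry can be written down concretely. A minimal sketch, assuming the standard definitions (Stein's entropy loss tr(Σ̂Σ⁻¹) − log det(Σ̂Σ⁻¹) − p for the covariance problem and its analogue for the precision problem; the log-determinant terms cancel when the two are averaged, which is why the loss "treats the dual problems equally"):

```python
import numpy as np

def stein_loss(sigma_hat, sigma):
    """Stein's (entropy) loss: tr(S_hat S^-1) - log det(S_hat S^-1) - p."""
    p = sigma.shape[0]
    m = sigma_hat @ np.linalg.inv(sigma)
    _, logdet = np.linalg.slogdet(m)
    return np.trace(m) - logdet - p

def symmetrized_stein_loss(sigma_hat, sigma):
    """Average of Stein's loss for the covariance problem and for the dual
    precision problem.  The log-det terms cancel, leaving
    (1/2)[tr(S_hat S^-1) + tr(S_hat^-1 S)] - p."""
    p = sigma.shape[0]
    a = np.trace(sigma_hat @ np.linalg.inv(sigma))
    b = np.trace(np.linalg.inv(sigma_hat) @ sigma)
    return 0.5 * (a + b) - p
```

The loss is zero exactly when the estimate matches the truth, and penalizes errors in the covariance and in its inverse symmetrically.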
As the main focus, we apply this method to Stein's loss. Compared to the estimator of Stein (Estimation of a covariance matrix (1975); J. Math. ... This paper introduces a new method for deriving covariance matrix estimators that are decision-theoretically optimal within a class of nonlinear shrinkage estimators. ... Acknowledgements We thank an associate editor and two anonymous referees for helpful comments that have greatly enhanced the exposition of the paper. ...doi:10.2139/ssrn.2264903 fatcat:6bmupekb4nh3tlkhqh2xo3icye
In this paper, we review some recent developments in the estimation of variances, covariance matrix, and precision matrix, with emphasis on the applications to microarray data analysis. ... Estimation of variances and covariances is required for many statistical methods such as t-test, principal component analysis and linear discriminant analysis. ... The authors thank the editor, the associate editor, and two referees for their constructive comments that led to a substantial improvement of this review article. ...doi:10.1002/wics.1308 fatcat:kw3irf5bojdztoaq5sbgqcujku
We also prove consistency in the high dimensional setting under the symmetrized Stein loss. ... This paper considers the problem of estimating high dimensional Laplacian constrained precision matrices by minimizing Stein's loss. ... Acknowledgements The author thanks Antonio Ortega for useful discussions. This work was funded by NSF under grants CCF-1410009 and CCF-2009032. ...arXiv:2111.00590v2 fatcat:xstaloilgvgwbegds7u5jsudvi
Let X be a p-dimensional normal vector with mean θ and covariance matrix σ²I, where σ² is unknown and I is the p × p identity matrix. ... The author considers the problem of estimating the location of a p-dimensional random vector based on a sample of size n. ...
Shrinkage approaches for estimating a high-dimensional covariance matrix are often employed to circumvent the limitations of the sample covariance matrix. ... In simulations, the proposed Stein-type shrinkage covariance matrix estimator based on a scaled identity matrix appeared to be up to 80% more efficient than existing ones in extreme high-dimensional settings ... Discussion We proposed a new family of nonparametric Stein-type shrinkage estimators for a high-dimensional covariance matrix. ...doi:10.1016/j.csda.2014.10.018 fatcat:l36lr3fbc5b7nflzxbggds2noy
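The scaled-identity target described in this entry follows a standard recipe: a convex combination of the sample covariance S and ν·I with ν = tr(S)/p. A minimal sketch, taking the shrinkage intensity as given (the papers' contribution is precisely a data-driven choice of that intensity, which is not reproduced here):

```python
import numpy as np

def shrinkage_covariance(X, lam):
    """Shrink the sample covariance S toward the scaled identity nu*I,
    nu = tr(S)/p:  Sigma_hat = (1-lam)*S + lam*nu*I.
    `lam` in [0, 1] is the shrinkage intensity, assumed given here."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / (n - 1)      # sample covariance (singular when p > n)
    nu = np.trace(S) / p         # average sample eigenvalue
    return (1 - lam) * S + lam * nu * np.eye(p)
```

Even when p > n and S is singular, any lam > 0 yields a nonsingular, better-conditioned estimate, which is the practical motivation for this family.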
In this paper, a shrinkage estimator for the population mean is proposed under known quadratic loss functions with unknown covariance matrices. ... Keywords: High-dimensional data; Shrinkage estimator; Large p small n; U-statistic. ... The authors thank the editor, the associate editor, and two reviewers for their helpful comments and suggestions that have substantially improved the paper. Conflict of Interest: None declared. ...doi:10.1016/j.jmva.2013.12.012 fatcat:lospxszkdvewnag63bbwyl3nnm
We shrink the centroids of clusters toward the overall mean of all data using a James-Stein-type adjustment, and then the James-Stein shrinkage estimators act as the new centroids in the next clustering ... Monte Carlo simulation shows that the magnitude of the improvement depends on the within-cluster variance and especially on the effective dimension of the covariance matrix. ... Bock Suppose an observation X is distributed according to the p-dimensional multivariate normal distribution with mean vector θ and covariance matrix Q, where Q is a symmetric positive definite covariance ...doi:10.1016/j.csda.2010.03.018 fatcat:hbdzvjp2c5dahg232be4uq3i2y
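The centroid adjustment described in this entry can be sketched with the classic positive-part James-Stein factor. This is an illustrative sketch, not the paper's exact rule: it assumes a known per-coordinate within-cluster variance `sigma2` (estimated in practice) and p > 2:

```python
import numpy as np

def js_shrink_centroids(centroids, sigma2):
    """Shrink each p-dimensional centroid toward the overall mean using a
    positive-part James-Stein factor (1 - (p-2)*sigma2/||c - cbar||^2)_+.
    sigma2 is the (assumed known) within-cluster variance per coordinate."""
    centroids = np.asarray(centroids, dtype=float)
    p = centroids.shape[1]
    cbar = centroids.mean(axis=0)            # overall mean of all centroids
    out = np.empty_like(centroids)
    for i, c in enumerate(centroids):
        d2 = np.sum((c - cbar) ** 2)
        factor = max(0.0, 1.0 - (p - 2) * sigma2 / d2) if d2 > 0 else 0.0
        out[i] = cbar + factor * (c - cbar)  # pull centroid toward cbar
    return out
```

The shrunken centroids then serve as the starting centroids for the next clustering iteration, as the abstract describes.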
Estimation of high-dimensional covariance matrices is known to be a difficult problem, has many applications, and is of current interest to the larger statistics community. ... In this paper, we propose a maximum likelihood approach, with the direct goal of obtaining a well-conditioned estimator. ... Acknowledgments We thank the editor and the associate editor for useful comments that improved the presentation of the paper. J. ...doi:10.1111/j.1467-9868.2012.01049.x pmid:23730197 pmcid:PMC3667751 fatcat:3fp6geqdvnb4vair2a4rq2ejra
Summary: “We consider a family of Stein-rule estimators for estimating the coefficient vector of a linear regression model with the covariance matrix depending on a few unknown parameters.” ... [Chaturvedi, Anoop] (6-ALL); Shukla, Govind (6-ALL) Stein rule estimation in linear model with nonscalar error covariance matrix. (English summary) Sankhya Ser. B 52 (1990), no. 3, 293-304. ...
Annals of Statistics
We consider the problem of estimating the mean vector of a p-variate normal (θ,Σ) distribution under invariant quadratic loss, (δ-θ)'Σ^-1(δ-θ), when the covariance is unknown. ... The proposed estimators of θ depend upon X and an independent Wishart matrix S with n degrees of freedom; however, S is singular almost surely when p>n. ... The authors are grateful to the Associate Editor and referees for helpful comments that strengthened the exposition and scope of this paper. ...doi:10.1214/12-aos1067 fatcat:eabqqhgwabclzbze4shsda3am4
such a likelihood function, in the other, with a penalty that constrains the trace of the sample covariance matrix. ... Two new orthogonally equivariant estimators of the covariance matrix are proposed. ... Wolf for providing us with the Matlab functions for their nonlinear shrinkage estimators. SB was supported by CTSC grant UL1-RR024996 and MTW by NSF grant DMS-1208488 for their efforts. ...doi:10.1016/j.jspi.2016.06.004 fatcat:vrn4lexlfzexdbfccojxfejm4y
High dimensional inverse covariance matrix estimation via linear programming. ... This paper considers ridge-type shrinkage estimation of a large dimensional precision matrix. The asymptotic optimal shrinkage coefficients and the theoretical loss are derived. ... Lixing Zhu's research was supported by a grant from the Research Grants Council of Hong Kong and a Faculty Research Grant (FRG) from Hong Kong Baptist University. ...doi:10.5705/ss.2012.328 fatcat:au3tl3agsrdrxn6sfmoyjzetya
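The ridge-type precision estimator this entry refers to has a simple generic form. A minimal sketch, assuming a given ridge parameter `alpha` (the cited work derives the asymptotically optimal coefficients, which are not reproduced here):

```python
import numpy as np

def ridge_precision(X, alpha):
    """Ridge-type shrinkage estimate of the precision matrix:
    Omega_hat = (S + alpha*I)^{-1}, S the sample covariance.
    alpha > 0 guarantees invertibility even when p > n."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / (n - 1)
    return np.linalg.inv(S + alpha * np.eye(p))
```

Adding alpha·I before inverting regularizes the small sample eigenvalues, which is what makes the estimator usable in the large-p, small-n regime these papers target.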
The estimation of the large and high-dimensional covariance matrix and precision matrix is a fundamental problem in modern multivariate analysis. ... However, the traditional sample estimators perform poorly for large and high-dimensional data. There are many approaches to improve covariance matrix estimation. ... Under Stein loss, an optimal nonlinear shrinkage estimator for the large-dimensional covariance matrix is proposed. ...doi:10.1109/access.2020.3031192 fatcat:52jtyv6c6beqrhxr5fxooxvqcy