Right singular vector projection graphs: fast high-dimensional covariance matrix estimation under latent confounding
In this work we consider the problem of estimating a high-dimensional $p \times p$ covariance matrix $\Sigma$, given $n$ observations of confounded data with covariance $\Sigma + \Gamma \Gamma^T$, where $\Gamma$ is an unknown $p \times q$ matrix of latent factor loadings. We propose a simple and scalable estimator based on the projection onto the right singular vectors of the observed data matrix, which we call RSVP. Our theoretical analysis of this method reveals that, in contrast to PCA-based approaches, RSVP copes well with settings where the smallest eigenvalue of $\Gamma^T \Gamma$ is close to the largest eigenvalue of $\Sigma$, as well as settings where the eigenvalues of $\Gamma^T \Gamma$ diverge rapidly. It can also handle heavy-tailed data, requiring only that the data have an elliptical distribution. RSVP does not require knowledge or estimation of the number of latent factors $q$, but it recovers $\Sigma$ only up to an unknown positive scale factor. We argue that this suffices in many applications, for example when an estimate of the correlation matrix is desired. We also show that subsampling can further improve the performance of the method. We demonstrate the favourable performance of RSVP through simulation experiments and an analysis of gene expression datasets collated by the GTEx consortium.
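To make the abstract's construction concrete, the following is a minimal sketch, not the authors' reference implementation: it assumes the estimator is (up to the unknown positive scale factor mentioned above) the projection onto the right singular vectors of the column-centred data matrix, with the scale removed by converting to a correlation matrix. The subsampling scheme shown (averaging the estimator over random row subsets, with the subset size `b` and repetition count `reps` chosen here purely for illustration) is likewise an assumed variant, not the paper's exact procedure.

```python
import numpy as np

def rsvp(X):
    """RSVP-style estimate (sketch): project onto the right singular
    vectors of the centred data matrix. Returns an estimate of Sigma
    up to an unknown positive scale factor."""
    Xc = X - X.mean(axis=0)                      # centre each variable
    _, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    r = int(np.sum(s > s[0] * 1e-10))            # numerical rank
    Vt = Vt[:r]                                  # keep informative directions
    return Vt.T @ Vt                             # p x p projection matrix

def to_correlation(S):
    """Rescale a (scaled) covariance estimate to a correlation matrix,
    which removes the unknown scale factor."""
    d = 1.0 / np.sqrt(np.diag(S))
    return S * np.outer(d, d)

def rsvp_subsampled(X, b, reps=50, seed=None):
    """Illustrative subsampling variant (an assumption, not the paper's
    exact scheme): average the RSVP estimate over random row subsets."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    acc = np.zeros((p, p))
    for _ in range(reps):
        idx = rng.choice(n, size=b, replace=False)
        acc += rsvp(X[idx])
    return acc / reps
```

Note that no estimate of the number of latent factors $q$ appears anywhere in the sketch, matching the abstract's claim that RSVP does not require it.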