17,964 Hits in 8.0 sec

Unified improvements in estimation of a normal covariance matrix in high and low dimensions

Hisayuki Tsukuma, Tatsuya Kubokawa
2016 Journal of Multivariate Analysis  
The problem of estimating a covariance matrix in multivariate linear regression models is addressed in a decision-theoretic framework.  ...  Although a standard loss function is the Stein loss, it is not available in the case of a high dimension.  ...  Our primary interest is in estimation of the covariance matrix Σ based on (V , X) and in derivation of unified dominance results irrespective of order of n, p and m in a decision-theoretic framework.  ... 
doi:10.1016/j.jmva.2015.09.016 fatcat:owb6zo36xng73iflfvbmhktd5a
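The Stein loss mentioned in the snippet has a simple closed form, tr(Σ̂Σ⁻¹) − log det(Σ̂Σ⁻¹) − p, and the log-determinant is exactly what breaks down when the estimator is singular in high dimensions. A minimal numerical sketch (the function name and test matrices are illustrative, not from the paper):

```python
import numpy as np

def stein_loss(est, true):
    """Stein loss: tr(est @ inv(true)) - logdet(est @ inv(true)) - p.
    It needs a nonsingular estimator, which is what fails when the
    dimension exceeds the sample size."""
    p = true.shape[0]
    m = est @ np.linalg.inv(true)
    sign, logdet = np.linalg.slogdet(m)
    return np.trace(m) - logdet - p

true_cov = np.diag([1.0, 2.0, 3.0])
# The loss vanishes (up to rounding) at the truth and is positive elsewhere.
print(abs(stein_loss(true_cov, true_cov)) < 1e-10)   # True
print(stein_loss(np.eye(3), true_cov) > 0)           # True
```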

Hierarchical Decompositions for the Computation of High-Dimensional Multivariate Normal Probabilities

Marc G. Genton, David E. Keyes, George Turkiyyah
2018 Journal of Computational and Graphical Statistics  
The scheme exploits the fact that the formally dense covariance matrix can be approximated by a matrix with a hierarchical low rank structure.  ...  This hierarchical decomposition leads to substantial efficiencies in multivariate normal probability computations and allows integrations in thousands of dimensions to be practical on modern workstations  ...  The particular structure we exploit is the hierarchically low rank nature of the formally dense covariance matrix, where matrix blocks in the off-diagonal regions admit a low rank approximation.  ... 
doi:10.1080/10618600.2017.1375936 fatcat:rj5apti36fbozenaa7zvkdizuy
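The structural fact quoted above — off-diagonal covariance blocks admitting low-rank approximations — can be illustrated in a few lines. The kernel, grid, and block sizes below are illustrative choices, not taken from the paper:

```python
import numpy as np

# Covariance from a smooth exponential kernel on a 1-D grid.
n = 200
x = np.linspace(0.0, 1.0, n)
cov = np.exp(-np.abs(x[:, None] - x[None, :]))

# An off-diagonal block: rows 0..99 against columns 100..199.
block = cov[:100, 100:]

# Truncated SVD keeps only the leading singular triplets. For this kernel
# the off-diagonal block is exactly separable (rank one), so k = 1 suffices;
# smoother kernels generally give numerically low-rank off-diagonal blocks.
U, s, Vt = np.linalg.svd(block, full_matrices=False)
k = 1
approx = (U[:, :k] * s[:k]) @ Vt[:k, :]

rel_err = np.linalg.norm(block - approx) / np.linalg.norm(block)
print(rel_err < 1e-8)   # True
```

Hierarchical methods apply such factorizations recursively over a block partition of the whole matrix, which is what makes thousands of dimensions tractable.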

Unified Principal Component Analysis with generalized Covariance Matrix for face recognition

Shiguang Shan, Bo Cao, Yu Su, Laiyun Qing, Xilin Chen, Wen Gao
2008 2008 IEEE Conference on Computer Vision and Pattern Recognition  
Each element of GCM is a generalized covariance of two random vectors rather than two scalar variables in CM.  ...  In this paper, some efforts are made to discover the underlying fundaments of these methods, and a novel framework called Unified Principal Component Analysis (UPCA) is proposed.  ...  Despite its success, one main drawback limits its usability: it is difficult to estimate the covariance matrix stably due to the high dimension of the image vectors and the relatively small size of the  ... 
doi:10.1109/cvpr.2008.4587375 dblp:conf/cvpr/ShanCSQCG08 fatcat:6o5icxddkvggbonxpvhj2dsmfa

Joint dimension reduction and clustering analysis for single-cell RNA-seq and spatial transcriptomics data [article]

Wei Liu, Xu Liao, Xiang Zhou, Xingjie Shi, Jin Liu
2021 bioRxiv   pre-print
Here, we develop a computation method, DR-SC, to perform both dimension reduction and (spatial) clustering jointly in a unified framework.  ...  However, the low-dimensional embeddings estimated in the dimension reduction step may not necessarily be relevant to the class labels inferred in the clustering step and thus may impair the performance  ...  In contrast, the expression levels of genes Fgfr2 and Fgfr3 changed from low to high and then to low.  ... 
doi:10.1101/2021.12.25.474153 fatcat:er3zaijdhnadzktd3suvfsv3gu

Unified eigen analysis on multivariate Gaussian based estimation of distribution algorithms

Weishan Dong, Xin Yao
2008 Information Sciences  
Multivariate Gaussian models are widely adopted in continuous Estimation of Distribution Algorithms (EDAs), and the covariance matrix plays an essential role in guiding the evolution.  ...  Unlike classical EDAs, ED-EDA focuses on eigen analysis of the covariance matrix, and it explicitly tunes the eigenvalues.  ...  If a normal pdf is used, the sample mean and covariance matrix are estimated in the same way as in EMNA_global.  ... 
doi:10.1016/j.ins.2008.01.021 fatcat:kx44fkyww5gzvibquie2jzfvfa
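The eigen-analysis idea in the snippet — decompose the estimated covariance and explicitly tune the eigenvalues before sampling — can be sketched as follows. The uniform eigenvalue amplification is an illustrative stand-in, not the paper's ED-EDA tuning rules:

```python
import numpy as np

rng = np.random.default_rng(0)

# A selected "population" in 3-D and its estimated Gaussian model.
pop = rng.normal(size=(50, 3)) @ np.diag([3.0, 1.0, 0.2])
mean = pop.mean(axis=0)
cov = np.cov(pop, rowvar=False)

# Eigen-decompose the covariance and explicitly tune the eigenvalues
# (a uniform amplification here, purely for illustration).
vals, vecs = np.linalg.eigh(cov)
tuned = vecs @ np.diag(2.0 * vals) @ vecs.T

# Sample the next generation from the tuned Gaussian model.
offspring = rng.multivariate_normal(mean, tuned, size=100)
print(offspring.shape)   # (100, 3)
```

Tuning eigenvalues rather than matrix entries keeps the search distribution positive definite while directly controlling its spread along each principal direction.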

Scale matrix estimation under data-based loss in high and low dimensions [article]

Mohamed Anis Haddouche, Dominique Fourdrinier, Fatiha Mezoued
2020 arXiv   pre-print
We adopt a unified approach to the two cases where S is invertible (p ≤ m) and S is non-invertible (p > m).  ...  We consider the problem of estimating the scale matrix Σ of the additive model Y_{p×n} = M + E, from a decision-theoretic point of view.  ...  We numerically evaluate the performance of the alternative estimator Σ̂_{a_o,G} in (9), where a_o = 1/p and t = 2(m − 1)/(p − m + 1), through the percentage relative improvement in average loss (PRIAL) of Σ̂  ... 
arXiv:2006.00243v1 fatcat:mz7d7squnnhgvftykrhk73kq2a
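The PRIAL criterion quoted above compares the average losses of an alternative estimator against a reference. A tiny illustrative computation (the per-replication loss values are made up for the demo):

```python
import numpy as np

def prial(losses_ref, losses_alt):
    """Percentage Relative Improvement In Average Loss over a reference."""
    return 100.0 * (np.mean(losses_ref) - np.mean(losses_alt)) / np.mean(losses_ref)

# Made-up per-replication losses for a reference and an alternative estimator.
ref = np.array([2.0, 2.2, 1.8, 2.1])
alt = np.array([1.5, 1.6, 1.4, 1.5])
print(round(prial(ref, alt), 2))   # 25.93
```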

A Bayesian nonparametric semi-supervised model for integration of multiple single-cell experiments [article]

Archit Verma, Barbara E Engelhardt
2020 bioRxiv   pre-print
We demonstrate that the semi-supervised t-distributed Gaussian process latent variable model (sstGPLVM), which projects the data onto a mixture of fixed and latent dimensions, can learn a unified low-dimensional  ...  Manifold alignment is a principled, effective tool for integrating multiple data sets and controlling for confounding factors.  ...  First, the pairwise distance matrix of the cell embeddings in both the estimated and simulated low-dimensional space is calculated. Each row is normalized to sum to one.  ... 
doi:10.1101/2020.01.14.906313 fatcat:htsstup2njd4xhf667snjogo4e

Unified subspace analysis for face recognition

Xiaogang Wang, Xiaoou Tang
2003 Proceedings Ninth IEEE International Conference on Computer Vision  
This eventually leads to the construction of a 3D parameter space that uses three subspace dimensions as axes.  ...  Using the face difference model and a detailed subspace analysis of the three components, we develop a unified framework for subspace analysis.  ...  ACKNOWLEDGMENT The work described in this paper was fully supported by grants from the Research Grants Council of the Hong Kong Special Administrative Region. (Project no.  ... 
doi:10.1109/iccv.2003.1238413 dblp:conf/iccv/WangT03 fatcat:kalk56gotrcfzgxuvd2puneef4

Robust coding schemes for indexing and retrieval from large face databases

Chengjun Liu, H. Wechsler
2000 IEEE Transactions on Image Processing  
The unifying theme of the new schemes is that of lowering the space dimension ("data compression") subject to increased fitness for the discrimination index.  ...  Learning to recognize visual objects, such as human faces, requires the ability to find meaningful patterns in spaces of very high dimensionality.  ...  The second step of PRM is then to estimate the within-class density, and under the normal probability distribution assumption this step is equivalent to estimating the within-class covariance matrices (Eq  ... 
doi:10.1109/83.817604 pmid:18255378 fatcat:hga2ubuwkfbu5e6noknzhh2ani

Covariance Estimation From Compressive Measurements Using Alternating Minimization

José Bioucas-Dias, Deborah Cohen, Yonina Eldar
2014 Zenodo  
Publication in the conference proceedings of EUSIPCO, Lisbon, Portugal, 2014.  ...  In order to properly exploit prior information in covariance estimation, we introduce a class of convex formulations and respective solutions to the high-dimensional covariance matrix estimation problem  ...  In comparison to the popular Kronecker-based vector formulation, it results in problems of smaller dimension and yields improved performance, particularly in low signal-to-noise ratio (SNR) regimes.  ... 
doi:10.5281/zenodo.44051 fatcat:gt7nv3c6wnggfpl6clnibiyohi

Improved Calibration of High-Dimensional Precision Matrices

Mengyi Zhang, Francisco Rubio, Daniel P. Palomar
2013 IEEE Transactions on Signal Processing  
In order to obtain well-behaved estimators in high-dimensional settings, we consider a general class of estimators of covariance matrices and precision matrices based on weighted sampling and linear shrinkage  ...  Estimation of a precision matrix (i.e., the inverse covariance matrix) is a fundamental problem in statistical signal processing applications.  ...  The parameters in the precision matrix estimator (1) represent a set of degrees of freedom with respect to which estimation performance can be improved.  ... 
doi:10.1109/tsp.2012.2236321 fatcat:f653gazisrgetmbephjg5dhgcm
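The linear-shrinkage idea behind the class of estimators described above has a simple canonical form: blend the sample covariance with a scaled identity before inverting. The sketch below fixes the shrinkage weight by hand, whereas calibrating such weights optimally is the paper's subject:

```python
import numpy as np

rng = np.random.default_rng(1)

# High-dimensional regime: dimension p close to the sample size n.
p, n = 50, 60
samples = rng.normal(size=(n, p))
S = np.cov(samples, rowvar=False)        # sample covariance, badly conditioned

# Linear shrinkage toward a scaled identity: (1 - rho) * S + rho * mu * I.
# rho is fixed by hand here; calibrating such weights is the hard part.
rho = 0.3
mu = np.trace(S) / p
S_shrunk = (1.0 - rho) * S + rho * mu * np.eye(p)

# Shrinkage provably improves conditioning, so inverting to obtain the
# precision matrix is much better behaved.
precision = np.linalg.inv(S_shrunk)
print(np.linalg.cond(S_shrunk) < np.linalg.cond(S))   # True
```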

Unified Framework to Regularized Covariance Estimation in Scaled Gaussian Models

Ami Wiesel
2012 IEEE Transactions on Signal Processing  
We propose a unified framework for regularizing this estimate in order to improve its finite sample performance. Our approach is based on the discovery of hidden convexity within the ML objective.  ...  These allow for shrinkage towards the identity matrix, shrinkage towards a diagonal matrix, shrinkage towards a given positive definite matrix, and regularization of the condition number.  ...  Chen and A. O. Hero, III, for numerous discussions on the topic which led to this work. In particular, Y. Chen provided the majorization-minimization interpretation of Tyler's fixed point iteration.  ... 
doi:10.1109/tsp.2011.2170685 fatcat:dsegehcqlvaqdm7tovbcbfjnzu
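Tyler's fixed-point iteration with shrinkage toward the identity, mentioned in the snippet, can be sketched as below. The specific blended update is one common regularized variant, chosen here for illustration, not necessarily the exact estimator derived in the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
p, n = 5, 40
X = rng.standard_t(df=3, size=(n, p))    # heavy-tailed samples

# Regularized Tyler-type fixed point: blend the weighted scatter with the
# identity (shrinkage target), then renormalize the trace.
alpha = 0.2
Sigma = np.eye(p)
for _ in range(200):
    inv = np.linalg.inv(Sigma)
    # Each sample is weighted by p over its Mahalanobis distance, which
    # downweights outliers relative to the plain sample covariance.
    w = p / np.einsum("ij,jk,ik->i", X, inv, X)
    data_term = (X * w[:, None]).T @ X / n
    Sigma_next = (1.0 - alpha) * data_term + alpha * np.eye(p)
    Sigma_next *= p / np.trace(Sigma_next)   # fix the scale: trace = p
    if np.linalg.norm(Sigma_next - Sigma) < 1e-10:
        Sigma = Sigma_next
        break
    Sigma = Sigma_next

print(round(np.trace(Sigma), 6))   # 5.0
```

The shrinkage term keeps every iterate positive definite even when n is small relative to p, which is exactly the regime where the unregularized Tyler iteration fails.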

Bayesian Face Revisited: A Joint Formulation [chapter]

Dong Chen, Xudong Cao, Liwei Wang, Fang Wen, Jian Sun
2012 Lecture Notes in Computer Science  
In this paper, we revisit the classical Bayesian face recognition method by Baback Moghaddam et al. and propose a new joint formulation.  ...  Our method achieved 92.4% test accuracy on the challenging Labeled Faces in the Wild (LFW) dataset. Compared with the current best commercial system, we reduced the error rate by 10%.  ...  First, suppose the face is represented as a d-dimensional feature; in the naive formulation, we need to estimate the covariance matrix in the higher-dimensional (2d) feature space of [x_1 x_2].  ... 
doi:10.1007/978-3-642-33712-3_41 fatcat:ifwxpkzllrelhmng75lzxopjo4
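The [x_1 x_2] construction in the snippet can be made concrete: under a joint model x = μ + ε with a shared identity component μ, the face pair has a 2d × 2d block covariance. The matrices below are random stand-ins, and the block layout is the standard one for such a latent-variable model rather than code from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
d = 4

# Random SPD stand-ins for the identity (mu) and within-person (eps)
# covariance components; these are illustrative, not learned from faces.
A = rng.normal(size=(d, d))
S_mu = A @ A.T + d * np.eye(d)
B = rng.normal(size=(d, d))
S_eps = B @ B.T + d * np.eye(d)

# Under x = mu + eps with a shared identity mu, the pair [x1; x2] has this
# 2d x 2d block covariance: S_mu fills the off-diagonal blocks because the
# two faces share the same identity component.
joint = np.block([[S_mu + S_eps, S_mu],
                  [S_mu, S_mu + S_eps]])

print(joint.shape)   # (8, 8)
```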

A low-rank based estimation-testing procedure for matrix-covariate regression [article]

Hung Hung, Zhi-Yu Jou
2016 arXiv   pre-print
sparse effects and low-rank effects of matrix-covariates are identified, respectively.  ...  In this work, we overcome the problem of high-dimensionality by utilizing the inherent structure of the matrix-covariate.  ...  By reporting η, we identify a sparse effect of G×G in the PSQI data, while a low-rank effect is detected in the EEG data. The matrix-covariate discussed in this work is an order-two tensor.  ... 
arXiv:1607.02957v1 fatcat:4cqtutnqzjadrdq5uyad4zv4za

Human Detection by Quadratic Classification on Subspace of Extended Histogram of Gradients

Amit Satpathy, Xudong Jiang, How-Lung Eng
2014 IEEE Transactions on Image Processing  
ExHoG alleviates the problem, inherent in HG, of discriminating between a dark object against a bright background and vice versa.  ...  By investigating the limitations of the Histogram of Gradients (HG) and the Histogram of Oriented Gradients (HOG), ExHoG is proposed as a new feature for human detection.  ...  This results in a difference in the reliability of the estimated covariance matrices, which makes Principal Component Analysis ineffective at removing unreliable dimensions.  ... 
doi:10.1109/tip.2013.2264677 pmid:23708804 fatcat:3oukezyqtzavhl4klfmsxdyxi4
Showing results 1 — 15 of 17,964