Improved Estimation of the Distance between Covariance Matrices

Malik Tiomoko, Romain Couillet, Eric Moisan, Steeve Zozor
ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
A wide range of machine learning and signal processing applications involve data discrimination through covariance matrices. A broad family of metrics is regularly exploited for this purpose, among which the Frobenius, Fisher, and Bhattacharyya distances, as well as the Kullback-Leibler and Rényi divergences. Not being directly accessible, these metrics are usually assessed through empirical sample covariances. We show here that, for large dimensional data, these approximations lead to dramatically erroneous distance and divergence estimates. In this article, based on advanced random matrix considerations, we provide a novel and versatile consistent estimate for these covariance matrix distances and divergences. While theoretically developed in the regime where both the data dimension and the sample size are large, practical simulations demonstrate its large performance gains over the standard approach even for very small dimensions. Particular emphasis is placed on the Fisher information metric, and a concrete application to covariance-based spectral clustering is investigated.
Index Terms: covariance distance, random matrix theory, Fisher information metric.
doi:10.1109/icassp.2019.8682621 dblp:conf/icassp/TiomokoCMZ19
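As an illustration of the problem setting (not of the authors' proposed estimator, which is not reproduced here), the following minimal Python/NumPy sketch shows the classical "plug-in" approach the abstract argues against: the true covariances in the Fisher information metric are replaced by sample covariance matrices. The function names and the toy parameters (p = 100, n = 200) are illustrative assumptions.

```python
# Minimal sketch (assumed names and parameters): the classical plug-in
# approach, i.e. replacing true covariances by sample covariance matrices
# inside the metric formula. The paper shows this is strongly biased when
# the dimension p is comparable to the sample size n.
import numpy as np
from scipy.linalg import eigvalsh

def sample_covariance(X):
    """Sample covariance of an (n, p) data matrix, zero-mean rows assumed."""
    n = X.shape[0]
    return X.T @ X / n

def fisher_distance(C1, C2):
    """Fisher information (affine-invariant) distance between SPD matrices:
    sqrt(sum_i log^2(lambda_i)), lambda_i the generalized eigenvalues of (C1, C2)."""
    lam = eigvalsh(C1, C2)  # eigenvalues of C2^{-1} C1
    return np.sqrt(np.sum(np.log(lam) ** 2))

# Toy experiment: both populations share the identity covariance,
# so the true Fisher distance is exactly zero.
rng = np.random.default_rng(0)
p, n = 100, 200
X1 = rng.standard_normal((n, p))  # samples from N(0, I_p)
X2 = rng.standard_normal((n, p))
C1_hat, C2_hat = sample_covariance(X1), sample_covariance(X2)

print(fisher_distance(np.eye(p), np.eye(p)))  # 0: distance between the true covariances
print(fisher_distance(C1_hat, C2_hat))        # far from 0: plug-in estimate is biased
```

In this toy run the true Fisher distance is zero, yet the plug-in estimate computed from the two sample covariances is far from zero. This is the failure mode that the paper's random-matrix-based consistent estimator is designed to correct.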