Nearly Non-Expansive Bounds for Mahalanobis Hard Thresholding

Xiao-Tong Yuan, Ping Li
Conference on Learning Theory (COLT), 2020
Given a vector w ∈ ℝ^p and a positive semi-definite matrix A ∈ ℝ^{p×p}, we study the expansion ratio bound for the following Mahalanobis hard thresholding operator of w:

H_{A,k}(w) := arg min_{‖θ‖₀ ≤ k} (1/2)‖θ − w‖²_A,

where k ≤ p is the desired sparsity level. The core contribution of this paper is to prove that for any k̄-sparse vector w̄ with k̄ < k, the estimation error satisfies

‖H_{A,k}(w) − w̄‖²_A ≤ (1 + O(√(κ(A, 2k) k̄ / (k − k̄)))) ‖w − w̄‖²_A,

where κ(A, 2k) is the restricted strong condition number of A over the (2k)-sparse subspaces. This estimation error bound is nearly non-expansive when k is sufficiently larger than k̄. In particular, when A is the identity matrix, so that κ(A, 2k) ≡ 1, our bound recovers the previously known nearly non-expansive bounds for the Euclidean hard thresholding operator. We further show that such a bound extends to an approximate version of H_{A,k}(w) estimated by the Hard Thresholding Pursuit (HTP) algorithm. We demonstrate the applicability of these bounds to the mean squared error analysis of HTP and to a novel extension of HTP based on preconditioning. Numerical evidence is provided to support our theory and to demonstrate the superiority of the proposed preconditioned HTP algorithm.
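The operator H_{A,k}(w) defined above can be illustrated with a small brute-force sketch (the helper name is hypothetical, and we assume A is positive definite so every k×k principal submatrix is invertible): for a fixed support S, the restricted minimizer of (1/2)‖θ − w‖²_A solves the normal equations A[S,S] θ_S = (Aw)_S, and H_{A,k}(w) is the best such solution over all C(p, k) supports. When A = I, this reduces to keeping the k largest-magnitude entries of w.

```python
import itertools
import numpy as np

def mahalanobis_hard_threshold(w, A, k):
    """Brute-force H_{A,k}(w) = argmin over k-sparse theta of 0.5*(theta-w)^T A (theta-w).

    Enumerates all k-element supports (tractable only for small p); on each
    support S, the stationarity condition [A(theta - w)]_S = 0 gives the
    restricted linear system A[S,S] theta_S = (A w)_S.
    Assumes A is positive definite so each subsystem is solvable.
    """
    p = w.shape[0]
    Aw = A @ w
    best_val, best_theta = np.inf, None
    for support in itertools.combinations(range(p), k):
        S = list(support)
        theta = np.zeros(p)
        # Solve the support-restricted normal equations.
        theta[S] = np.linalg.solve(A[np.ix_(S, S)], Aw[S])
        r = theta - w
        val = 0.5 * r @ A @ r  # Mahalanobis objective 0.5*||theta - w||_A^2
        if val < best_val:
            best_val, best_theta = val, theta
    return best_theta
```

As a sanity check, with A = I the operator keeps the top-k entries of w in magnitude, matching the Euclidean hard thresholding operator that the paper's bound specializes to.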