A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2022; you can also visit the original URL.
The file type is `application/pdf`

.

##
###
Nearly Non-Expansive Bounds for Mahalanobis Hard Thresholding

2020
*
Annual Conference Computational Learning Theory
*

Given a vector w ∈ R p and a positive semi-definite matrix A ∈ R p×p , we study the expansion ratio bound for the following defined Mahalanobis hard thresholding operator of w: where k ≤ p is the desired sparsity level. The core contribution of this paper is to prove that for any k-sparse vector w with k < k, the estimation error where κ(A, 2k) is the restricted strong condition number of A over (2k)-sparse subspace. This estimation error bound is nearly non-expansive when k is sufficiently

dblp:conf/colt/Yuan020
fatcat:uu4lxden4rayznizreumewvp7y