Concentration inequalities under sub-Gaussian and sub-exponential conditions

Andreas Maurer, Massimiliano Pontil
2021 Neural Information Processing Systems  
We prove analogues of the popular bounded difference inequality (also called McDiarmid's inequality) for functions of independent random variables under sub-Gaussian and sub-exponential conditions. Applied to vector-valued concentration and the method of Rademacher complexities, these inequalities allow an easy extension of uniform convergence results for PCA and linear regression to the case of potentially unbounded input and output variables.
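For reference, a standard statement of the bounded difference (McDiarmid) inequality that the paper generalizes: if $X_1,\dots,X_n$ are independent and $f$ satisfies $|f(x_1,\dots,x_i,\dots,x_n)-f(x_1,\dots,x_i',\dots,x_n)|\le c_i$ for all $i$ and all arguments, then
\[
\Pr\bigl(f(X_1,\dots,X_n)-\mathbb{E}\,f(X_1,\dots,X_n)\ge t\bigr)\le \exp\!\left(-\frac{2t^2}{\sum_{i=1}^n c_i^2}\right).
\]
The paper's results replace the uniform bounds $c_i$ on the differences with sub-Gaussian or sub-exponential conditions.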