On the convergence of stochastic approximations under a subgeometric ergodic Markov dynamic

Vianney Debavelaere, Stanley Durrleman, Stéphanie Allassonnière
2021 Electronic Journal of Statistics  
In this paper, we extend the framework for the convergence of stochastic approximations. Such a procedure is used in many methods, such as parameter estimation within a Metropolis-Hastings algorithm, stochastic gradient descent, or the stochastic Expectation Maximization algorithm. It is given by $\theta_{n+1} = \theta_n + \Delta_{n+1} H_{\theta_n}(X_{n+1})$, where $(X_n)_{n \in \mathbb{N}}$ is a sequence of random variables following a parametric distribution which depends on $(\theta_n)_{n \in \mathbb{N}}$, and $(\Delta_n)_{n \in \mathbb{N}}$ is a step-size sequence. The convergence of such a stochastic approximation has
been proved under an assumption of geometric ergodicity of the Markov dynamic. However, in many practical situations this hypothesis is not satisfied, for instance for any heavy-tailed target distribution in a Monte Carlo Metropolis-Hastings algorithm. In this paper, we relax this hypothesis and prove the convergence of the stochastic approximation assuming only subgeometric ergodicity of the Markov dynamic. This result opens up the possibility of deriving more generic algorithms with proven convergence. As a first example, we study an adaptive Markov chain Monte Carlo algorithm in which the proposal distribution is adapted by learning the variance of a heavy-tailed target distribution. We then apply our work to Independent Component Analysis, where a positive heavy-tailed noise leads to a subgeometric dynamic in an Expectation Maximization algorithm.
MSC2020 subject classifications: Primary 62L20, 60J05; secondary 90C15.
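As a hedged illustration (not the authors' implementation), the sketch below shows a Robbins-Monro-type recursion $\theta_{n+1} = \theta_n + \Delta_{n+1} H_{\theta_n}(X_{n+1})$ in the spirit of the adaptive MCMC example above: each $X_{n+1}$ is produced by one Metropolis-Hastings step targeting a heavy-tailed Student-t distribution, and $\theta_n$ tracks its second moment. The $t_3$ target, the step exponent 0.7, and all function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative sketch only: a Robbins-Monro stochastic approximation
#   theta_{n+1} = theta_n + Delta_{n+1} * H_{theta_n}(X_{n+1})
# driven by a random-walk Metropolis-Hastings chain whose target is a
# heavy-tailed Student-t distribution, so the kernel is not geometrically
# ergodic. All names and constants are illustrative assumptions.

rng = np.random.default_rng(0)
DF = 3.0  # degrees of freedom of the Student-t target (second moment = DF / (DF - 2) = 3)

def log_target(x):
    """Log-density of the Student-t target, up to an additive constant."""
    return -0.5 * (DF + 1.0) * np.log(1.0 + x * x / DF)

def mh_step(x, scale):
    """One random-walk Metropolis-Hastings step with Gaussian proposal std `scale`."""
    prop = x + scale * rng.standard_normal()
    if np.log(rng.uniform()) < log_target(prop) - log_target(x):
        return prop
    return x

theta, x = 1.0, 0.0                        # theta_0: initial guess of the second moment
for n in range(1, 200_001):
    delta = (n + 10.0) ** -0.7             # step sequence Delta_n (offset keeps delta < 1)
    x = mh_step(x, scale=np.sqrt(theta))   # X_{n+1} drawn from a kernel depending on theta_n
    theta += delta * (x * x - theta)       # H_theta(X) = X^2 - theta, so h(theta) = E[X^2] - theta
    # In the theory, a projection/truncation step usually keeps theta in a compact set;
    # it is omitted here for brevity.

print(f"estimated second moment: {theta:.2f} (true value 3.0 for a t_3 target)")
```

Because the target has infinite fourth moment, the estimate fluctuates noticeably and converges slowly; the point of the paper is that subgeometric ergodicity of such a kernel is enough to guarantee convergence of this kind of recursion.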
doi:10.1214/21-ejs1827