A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2021; you can also visit the original URL. The file type is application/pdf.
Reliable Categorical Variational Inference with Mixture of Discrete Normalizing Flows
[article] · 2021 · arXiv pre-print
Variational approximations are increasingly based on gradient-based optimization of expectations estimated by sampling. Handling discrete latent variables is then challenging because the sampling process is not differentiable. Continuous relaxations, such as the Gumbel-Softmax for the categorical distribution, enable gradient-based optimization, but do not define a valid probability mass for discrete observations. In practice, selecting the amount of relaxation is difficult and one needs to …
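As a minimal sketch (not taken from the paper) of the Gumbel-Softmax relaxation the abstract refers to: perturb the category logits with Gumbel(0, 1) noise and pass them through a temperature-scaled softmax. The function name and the temperature parameter `tau` here are illustrative assumptions.

```python
import numpy as np

def gumbel_softmax_sample(logits, tau, rng):
    # Draw Gumbel(0, 1) noise via the inverse-CDF trick: -log(-log(U))
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    # Temperature-scaled softmax of the perturbed logits; as tau -> 0 the
    # sample approaches a one-hot vector, but for tau > 0 it stays continuous
    y = (logits + g) / tau
    y = np.exp(y - y.max())  # subtract max for numerical stability
    return y / y.sum()

rng = np.random.default_rng(0)
logits = np.log(np.array([0.2, 0.3, 0.5]))
sample = gumbel_softmax_sample(logits, tau=0.5, rng=rng)
# `sample` lies on the probability simplex but is not exactly one-hot,
# which is why the relaxation assigns no valid probability mass to
# genuinely discrete observations.
```

The temperature `tau` controls the amount of relaxation: small values give near-discrete samples with high-variance gradients, large values give smooth samples that stray far from the categorical distribution, which is the tuning difficulty the abstract points to.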
arXiv:2006.15568v2
fatcat:delvtnvefrflno5tyt7bzpd2ei