Sobolev GAN
[article] · 2017 · arXiv pre-print
Finally we show that a variant of Sobolev GAN achieves competitive results in semi-supervised learning on CIFAR-10, thanks to the smoothness enforced on the critic by Sobolev GAN which relates to Laplacian ...
The Sobolev IPM compares the mean discrepancy of two distributions for functions (critic) restricted to a Sobolev ball defined with respect to a dominant measure μ. ...
Figure 5: Comparison of annealed versus non-annealed smoothing of P_r in Sobolev GAN. We see that annealed smoothing outperforms the non-annealed smoothing experiments. ...
arXiv:1711.04894v1
fatcat:dj4uti3f2zc3xcrsdczlxsfd6m
Generative Modeling by Estimating Gradients of the Data Distribution
[article] · 2020 · arXiv pre-print
Our models produce samples comparable to GANs on MNIST, CelebA and CIFAR-10 datasets, achieving a new state-of-the-art inception score of 8.87 on CIFAR-10. ...
We introduce a new generative model where samples are produced via Langevin dynamics using gradients of the data distribution estimated with score matching. ...
Acknowledgements Toyota Research Institute ("TRI") provided funds to assist the authors with their research but this article solely reflects the opinions and conclusions of its authors and not TRI or any ...
arXiv:1907.05600v3
fatcat:s4pkocgofvgrbk6ifqtwsxf73u
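The entry above trains its score model (the gradient of the log data density) with score matching. As a hedged sketch of the denoising variant of that objective (the function name and toy setup below are ours, not from the paper): perturb each data point with Gaussian noise and regress the model's score at the noisy point onto the score of the perturbation kernel.

```python
import numpy as np

def dsm_loss(score_fn, x, sigma, rng):
    """Denoising score matching (Vincent, 2011): perturb each sample with
    Gaussian noise of scale sigma, then regress the model score at the
    noisy point onto -noise / sigma**2, the score of the Gaussian kernel."""
    noise = rng.normal(0.0, sigma, size=x.shape)
    x_tilde = x + noise
    target = -noise / sigma**2
    return 0.5 * np.mean((score_fn(x_tilde) - target) ** 2)
```

Over unrestricted score functions, the minimizer of this loss is the score of the noise-perturbed data distribution; for standard-normal data that is s(x) = -x / (1 + sigma**2), which scores lower than, say, a wrong-signed model.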
Adversarial score matching and improved sampling for image generation
[article] · 2020 · arXiv pre-print
In addition, we propose two improvements to DSM-ALS: 1) Consistent Annealed Sampling as a more stable alternative to Annealed Langevin Sampling, and 2) a hybrid training formulation, composed of both Denoising ...
Despite the convincing visual quality of samples, this method appears to perform worse than Generative Adversarial Networks (GANs) under the Fréchet Inception Distance, a standard metric for generative ...
ANNEALED LANGEVIN SAMPLING: Given a score function, one can use Langevin dynamics (or Langevin sampling) (Welling and Teh, 2011) to sample from the corresponding probability distribution. ...
arXiv:2009.05475v2
fatcat:dxzvvcketvbahpiraq64cnwux4
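The snippet above describes annealed Langevin sampling: given a noise-conditional score function, Langevin updates are run at a decreasing sequence of noise levels. A minimal sketch, assuming the sigma-squared step-size schedule of Song and Ermon (2019); the function signature is ours, not any paper's API:

```python
import numpy as np

def annealed_langevin(score_fn, x0, sigmas, eps=0.1, steps=100, rng=None):
    """Annealed Langevin dynamics: run Langevin updates at each noise
    level sigma (largest first), with the step size at level sigma
    scaled as eps * (sigma / sigma_min)**2."""
    if rng is None:
        rng = np.random.default_rng(0)
    x = np.asarray(x0, dtype=float).copy()
    sigma_min = sigmas[-1]
    for sigma in sigmas:                        # anneal from coarse to fine
        alpha = eps * (sigma / sigma_min) ** 2  # per-level step size
        for _ in range(steps):
            z = rng.normal(size=x.shape)
            # Langevin update: drift along the score plus injected noise
            x = x + 0.5 * alpha * score_fn(x, sigma) + np.sqrt(alpha) * z
    return x
```

With a toy Gaussian target whose perturbed score is known in closed form, chains initialized far from the data converge to roughly the right mean and spread.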
Learning Weight Uncertainty with Stochastic Gradient MCMC for Shape Classification
2016
2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
Learning the representation of shape cues in 2D & 3D objects for recognition is a fundamental task in computer vision. Deep neural networks (DNNs) have shown promising performance on this task. ...
Due to the large variability of shapes, accurate recognition relies on good estimates of model uncertainty, ignored in traditional training of DNNs, typically learned via stochastic optimization. ...
A regularizer imposed on the model parameters can be viewed as the log of a prior on the distribution of the parameters, with such a prior connected to a Bayesian perspective. ...
doi:10.1109/cvpr.2016.611
dblp:conf/cvpr/LiSCPGC16
fatcat:oexvaelzebg4zhq34ffmgip4di
Iterative Reconstruction for Low-Dose CT using Deep Gradient Priors of Generative Model
[article] · 2021 · arXiv pre-print
At the stage of prior learning, the gradient of data density is directly learned from normal-dose CT images as a prior. ...
Then at the iterative reconstruction stage, the stochastic gradient descent is employed to update the trained prior with annealed and conditional schemes. ...
McCollough of the Mayo Clinic, Rochester, MN, USA, for providing clinical projection data, as agreed under the American Association of Physicists in Medicine. ...
arXiv:2009.12760v2
fatcat:uniekyvc5bbubebztfoiooi57y
Source Separation with Deep Generative Priors
[article] · 2020 · arXiv pre-print
This paper introduces a Bayesian approach to source separation that uses generative models as priors over the components of a mixture of sources, and noise-annealed Langevin dynamics to sample from the posterior distribution of sources given a mixture. ...
learned by a GAN. ...
arXiv:2002.07942v2
fatcat:jqumlhr66rdxpgf5baf4x24fzi
Learning Gradient Fields for Shape Generation
[article] · 2020 · arXiv pre-print
A point cloud can be viewed as samples from a distribution of 3D points whose density is concentrated near the surface of the shape. ...
We show that our method can reach state-of-the-art performance for point cloud auto-encoding and generation, while also allowing for extraction of a high-quality implicit surface. ...
Thus, this annealed Langevin dynamics can be thought of as a coarse-to-fine refinement of the shape. ...
arXiv:2008.06520v2
fatcat:pot37vogyrgczbe2eelpnyhaba
MRI Reconstruction Using Deep Energy-Based Model
[article] · 2021 · arXiv pre-print
Simultaneously, implicit inference with Langevin dynamics is a unique property of reconstruction. ...
of a generative model. ...
By assembling the discussions in the above subsections, the final formulation of the EBMRec model at each iteration of the annealed Langevin dynamics can be formulated as follows: [equation garbled in extraction] ...
arXiv:2109.03237v2
fatcat:s23takonkfaxpdr4lvlpizinw4
A Wasserstein Minimum Velocity Approach to Learning Unnormalized Models
[article] · 2020 · arXiv pre-print
In this paper, we present a scalable approximation to a general family of learning objectives including score matching, by observing a new connection between these objectives and Wasserstein gradient flows ...
We present applications with promise in learning neural density estimators on manifolds, and training implicit variational and Wasserstein auto-encoders with a manifold-valued prior. ...
In this work, we present a unifying perspective to this problem, and derive scalable approximations for a variety of learning objectives including score matching. ...
arXiv:2002.07501v1
fatcat:ptisyprqujbovmolstr7t6xbzm
Diagnostic Visualization for Deep Neural Networks Using Stochastic Gradient Langevin Dynamics
[article] · 2018 · arXiv pre-print
LDAM provides two affordances in combination: the ability to explore the set of maximally activating pre-images, and the ability to trade off interpretability and pixel-level accuracy using a GAN-style discriminator as a regularizer. ...
Acknowledgements We gratefully acknowledge the support of NVIDIA Corporation with the donation of the Titan X GPU used for this research. ...
arXiv:1812.04604v1
fatcat:s3slzm2mmfhojb47avhrvnxscu
Learning to Draw Samples with Amortized Stein Variational Gradient Descent
[article] · 2017 · arXiv pre-print
We demonstrate our method with a number of applications, including variational autoencoder (VAE) with expressive encoders to model complex latent space structures, and hyper-parameter learning of MCMC ...
Our method is based on iteratively adjusting the neural network parameters so that the output changes along a Stein variational gradient direction (Liu & Wang, 2016) that maximally decreases the KL divergence ...
We thank Yingzhen Li from University of Cambridge for her valuable comments and feedback. ...
arXiv:1707.06626v2
fatcat:czlv76pwdfdwddhvwwvwanv334
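The entry above adjusts network parameters so that outputs move along the Stein variational gradient direction of Liu and Wang (2016). A minimal non-amortized sketch of one such update, assuming an RBF kernel with the median-heuristic bandwidth (the helper name and constants are ours):

```python
import numpy as np

def svgd_step(x, score_fn, step=0.05):
    """One SVGD update on particles x of shape (n, d):
    phi(x_i) = mean_j [ k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i) ],
    with an RBF kernel whose bandwidth follows the median heuristic."""
    n = len(x)
    diff = x[:, None, :] - x[None, :, :]        # (n, n, d): x_j - x_i
    sq = np.sum(diff**2, axis=-1)               # squared pairwise distances
    h = np.median(sq) / np.log(n + 1) + 1e-8    # median-heuristic bandwidth
    K = np.exp(-sq / h)                         # k(x_j, x_i)
    grad_k = (-2.0 / h) * diff * K[:, :, None]  # grad of k w.r.t. x_j
    # Attractive term (kernel-weighted scores) plus repulsive term
    phi = (K.T @ score_fn(x) + grad_k.sum(axis=0)) / n
    return x + step * phi
```

Iterating this step on particles initialized away from a standard-normal target drives them toward the target while the repulsive term keeps them spread out.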
Generative Modeling by Inclusive Neural Random Fields with Applications in Image Generation and Anomaly Detection
[article] · 2020 · arXiv pre-print
Different from various directed graphical models such as generative adversarial networks (GANs), NRFs provide an interesting family of undirected graphical models for generative modeling. ...
Remarkably, in addition to superior sample generation, one additional benefit of our inclusive-NRF approach is that, unlike GANs, it can directly provide (unnormalized) density estimate for sample evaluation ...
After training, NCSN uses an annealed Langevin dynamics to produce decent images, using a total of 10 × 1000 Langevin steps with 10 noise levels. ...
arXiv:1806.00271v5
fatcat:kvo3vg3ayjfcjjpmntveptk6pm
Probabilistic Mapping of Dark Matter by Neural Score Matching
[article] · 2020 · arXiv pre-print
In this work, we present a novel methodology for addressing such inverse problems by combining elements of Bayesian statistics, analytic physical theory, and a recent class of Deep Generative Models based ...
By measuring this lensing effect on a large number of galaxies it is possible to reconstruct maps of the Dark Matter distribution on the sky. ...
This procedure is similar to the Annealed Langevin diffusion proposed in Song and Ermon (2019). ...
arXiv:2011.08271v1
fatcat:gpqiobtx2naxzifni37qqwndo4
Deep Generative Learning via Schrödinger Bridge
[article] · 2021 · arXiv pre-print
Experimental results on multimodal synthetic data and benchmark data support our theoretical findings and indicate that the generative model via Schrödinger Bridge is comparable with state-of-the-art GANs ...
GANs, suggesting a new formulation of generative learning. ...
arXiv:2106.10410v2
fatcat:domn5aacezb57ay2ppmpi6s5xq
Bridging Explicit and Implicit Deep Generative Models via Neural Stein Estimators
[article] · 2021 · arXiv pre-print
To take full advantage of both models and enable mutual compensation, we propose a novel joint training framework that bridges an explicit (unnormalized) density estimator and an implicit sample generator ...
There are two types of deep generative models: explicit and implicit. ...
Variational annealing of GANs: A Langevin perspective. In ICML, pages 6176–6185, 2019. [43] Cédric Villani. Optimal Transport: Old and New, volume 338. Springer Science & Business Media, 2008. ...
arXiv:1909.13035v3
fatcat:hjl4e5k7gvgypb5uojqn4kccga
Showing results 1 — 15 out of 78 results