
Wasserstein Distance to Independence Models [article]

Türkü Özlüm Çelik, Asgar Jamneshan, Guido Montúfar, Bernd Sturmfels, Lorenzo Venturello
2020 arXiv   pre-print
Given any data distribution, we seek to minimize its Wasserstein distance to a fixed independence model. The solution to this optimization problem is a piecewise algebraic function of the data.  ...  An independence model for discrete random variables is a Segre-Veronese variety in a probability simplex.  ...  In Sections 4 and 5, we compute the vectors f and δ for Wasserstein distance to the independence models. Section 6 features numerical experiments.  ... 
arXiv:2003.06725v2 fatcat:e6mu2ny6vneyho4qbqheui5s64
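
The inner step of the optimization described above is a finite, discrete optimal transport problem. As a rough illustration only (the Hamming ground metric and the example distributions are assumptions of this sketch, not taken from the paper), here is the Wasserstein distance between a data distribution on two bits and a single candidate point of the independence model, computed with scipy's LP solver:

    import numpy as np
    from itertools import product
    from scipy.optimize import linprog

    # States of two binary variables and the Hamming ground metric between them.
    states = list(product([0, 1], repeat=2))                     # 00, 01, 10, 11
    C = np.array([[sum(a != b for a, b in zip(s, t)) for t in states] for s in states], float)

    def wasserstein(mu, nu, C):
        """Exact W1 between two distributions on the same finite state space,
        solved as the transport LP: min <C, P> s.t. P 1 = mu, P^T 1 = nu, P >= 0."""
        n = len(mu)
        A_eq = np.zeros((2 * n, n * n))
        for i in range(n):
            A_eq[i, i * n:(i + 1) * n] = 1.0      # row sum i equals mu[i]
            A_eq[n + i, i::n] = 1.0               # column sum i equals nu[i]
        b_eq = np.concatenate([mu, nu])
        res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
        return res.fun

    # Example: an empirical joint distribution vs. one product (independence) point,
    # where p and q are the marginal probabilities of the outcome 0.
    mu = np.array([0.4, 0.1, 0.1, 0.4])
    p, q = 0.5, 0.5
    nu = np.array([p * q, p * (1 - q), (1 - p) * q, (1 - p) * (1 - q)])
    print(wasserstein(mu, nu, C))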

Comparison of Maximum Likelihood and GAN-based training of Real NVPs [article]

Ivo Danihelka, Balaji Lakshminarayanan, Benigno Uria, Daan Wierstra, Peter Dayan
2017 arXiv   pre-print
We show that an independent critic trained to approximate Wasserstein distance between the validation set and the generator distribution helps detect overfitting.  ...  We then compare the generated samples, exact log-probability densities and approximate Wasserstein distances.  ...  It would be possible to use auto-regressive discriminators (Oord et al., 2016) to decompose the large KL divergence to multiple smaller terms:  ... 
arXiv:1705.05263v1 fatcat:tjxqumbrybenhgwzknqqqw3f3a
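
A minimal sketch of the "independent critic" idea, assuming a small MLP with WGAN-style weight clipping (the paper's critic architecture and training details differ): the critic is trained to maximize the mean gap between two sample sets, and the resulting gap is a crude approximation of their Wasserstein-1 distance. Comparing the gap measured against training data with the gap measured against held-out validation data is the overfitting signal described in the snippet.

    import torch
    import torch.nn as nn

    def critic_w1_estimate(real, fake, steps=300, clip=0.01, lr=5e-4):
        """Train a critic f to maximize E_real[f] - E_fake[f]; weight clipping keeps f
        roughly 1-Lipschitz, so the final gap is a rough estimate of W1(real, fake)."""
        critic = nn.Sequential(nn.Linear(real.shape[1], 64), nn.ReLU(), nn.Linear(64, 1))
        opt = torch.optim.RMSprop(critic.parameters(), lr=lr)
        for _ in range(steps):
            gap = critic(real).mean() - critic(fake).mean()
            opt.zero_grad()
            (-gap).backward()                       # gradient ascent on the gap
            opt.step()
            for p in critic.parameters():
                p.data.clamp_(-clip, clip)
        with torch.no_grad():
            return (critic(real).mean() - critic(fake).mean()).item()

    # A generator that memorized the training set shows a much larger critic gap
    # against validation samples than against training samples.
    train_x, valid_x = torch.randn(512, 2), torch.randn(512, 2)
    samples = train_x[torch.randint(0, 512, (512,))]    # stand-in "generator" output
    print(critic_w1_estimate(train_x, samples), critic_w1_estimate(valid_x, samples))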

The Cramer Distance as a Solution to Biased Wasserstein Gradients [article]

Marc G. Bellemare, Ivo Danihelka, Will Dabney, Shakir Mohamed, Balaji Lakshminarayanan, Stephan Hoyer, Rémi Munos
2017 arXiv   pre-print
Leveraging insights from probabilistic forecasting we propose an alternative to the Wasserstein metric, the Cramér distance.  ...  The value of being sensitive to this geometry has been demonstrated, among others, in ordinal regression and generative modelling.  ...  how the Cramér distance compares to the 1-Wasserstein metric, we consider modelling the discrete distribution P depicted in Figure 1 (left).  ... 
arXiv:1705.10743v1 fatcat:rfgqmom5qfeq7mnea7fnxtmlbu
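
In one dimension both quantities have simple CDF expressions, which makes the comparison in the snippet easy to reproduce numerically. A small numpy sketch (my own example distributions, not the ones from the paper's Figure 1):

    import numpy as np

    def w1_and_cramer(x, p, q):
        """1-D closed forms: W1 = integral |F - G|, Cramer (l2) = sqrt(integral (F - G)^2),
        where F, G are the CDFs of the discrete distributions p, q on ordered support x."""
        F, G = np.cumsum(p), np.cumsum(q)
        gaps = np.diff(np.asarray(x, float))      # spacing between consecutive support points
        diff = F[:-1] - G[:-1]
        return np.sum(np.abs(diff) * gaps), np.sqrt(np.sum(diff ** 2 * gaps))

    x = [0.0, 1.0, 2.0, 3.0]
    p = [0.5, 0.0, 0.0, 0.5]
    q = [0.0, 0.5, 0.5, 0.0]
    print(w1_and_cramer(x, p, q))     # W1 = 1.0, Cramer distance = sqrt(0.5)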

Orthogonal Estimation of Wasserstein Distances [article]

Mark Rowland and Jiri Hron and Yunhao Tang and Krzysztof Choromanski and Tamas Sarlos and Adrian Weller
2019 arXiv   pre-print
In this paper, we propose a new variant of sliced Wasserstein distance, study the use of orthogonal coupling in Monte Carlo estimation of Wasserstein distances and draw connections with stratified sampling  ...  Wasserstein distances are increasingly used in a wide variety of applications in machine learning.  ...  Due to the close connections between various Wasserstein distance measures, we propose to set D(·, ·) as either sliced Wasserstein distance or projected Wasserstein distance.  ... 
arXiv:1903.03784v2 fatcat:5va23hq3fffa5h6bjjzppfpwia
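
A minimal sketch of a sliced Wasserstein estimator with orthogonally coupled projection directions; the QR factorization of a Gaussian matrix is used here as a stand-in for the paper's coupling construction, so treat it as illustrative rather than as the proposed estimator itself.

    import numpy as np
    from scipy.stats import wasserstein_distance

    def sliced_w1(X, Y, n_proj=64, orthogonal=True, seed=0):
        """Average of exact 1-D Wasserstein distances along random directions.
        With orthogonal=True the directions are drawn in orthonormal blocks."""
        rng = np.random.default_rng(seed)
        d, total, done = X.shape[1], 0.0, 0
        while done < n_proj:
            block = rng.standard_normal((d, min(d, n_proj - done)))
            if orthogonal:
                block, _ = np.linalg.qr(block)            # orthonormal columns
            else:
                block /= np.linalg.norm(block, axis=0)    # i.i.d. unit directions
            for u in block.T:
                total += wasserstein_distance(X @ u, Y @ u)
                done += 1
        return total / n_proj

    X = np.random.default_rng(1).normal(size=(500, 5))
    Y = np.random.default_rng(2).normal(loc=0.5, size=(500, 5))
    print(sliced_w1(X, Y), sliced_w1(X, Y, orthogonal=False))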

Gaussian Word Embedding with a Wasserstein Distance Loss [article]

Chi Sun, Hang Yan, Xipeng Qiu, Xuanjing Huang
2018 arXiv   pre-print
Therefore, with the aim of representing words in a highly efficient way, we propose to operate a Gaussian word embedding model with a loss function based on the Wasserstein distance.  ...  The Wasserstein distance provides a natural notion of dissimilarity with probability measures and has a closed-form solution when measuring the distance between two Gaussian distributions.  ...  embedding model (Vilnis and McCallum 2014), Wasserstein Distance Gaussian model, and Wasserstein Distance Gaussian model with external information, respectively.  ... 
arXiv:1808.07016v7 fatcat:stvuzzcngbhrnm7iqfcet45kyy
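
The closed form referred to above is W2^2(N(m1, S1), N(m2, S2)) = ||m1 - m2||^2 + Tr(S1 + S2 - 2 (S2^{1/2} S1 S2^{1/2})^{1/2}); with diagonal covariances, the usual choice for Gaussian word embeddings, it reduces to coordinate-wise terms. A short sketch (not the paper's code; the example vectors are made up):

    import numpy as np

    def w2_diag_gaussians(m1, s1, m2, s2):
        """Squared 2-Wasserstein distance between N(m1, diag(s1)) and N(m2, diag(s2)),
        where s1 and s2 are variance vectors (diagonal case of the Gaussian closed form)."""
        m1, s1, m2, s2 = map(np.asarray, (m1, s1, m2, s2))
        return np.sum((m1 - m2) ** 2) + np.sum((np.sqrt(s1) - np.sqrt(s2)) ** 2)

    # Two hypothetical 3-dimensional Gaussian word embeddings.
    print(w2_diag_gaussians([0.1, 0.2, 0.0], [1.0, 0.5, 0.2],
                            [0.0, 0.1, 0.3], [0.8, 0.5, 0.1]))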

Optimal Transport to a Variety [article]

T. Ö. Çelik, A. Jamneshan, G. Montúfar, B. Sturmfels, L. Venturello
2020 arXiv   pre-print
A detailed analysis is given for the two bit independence model.  ...  We study the problem of minimizing the Wasserstein distance between a probability distribution and an algebraic variety.  ...  What we are interested in is the minimum Wasserstein distance from µ to any point ν in the independence model M.  ... 
arXiv:1909.11716v2 fatcat:d4uetu6fvfapte3tjjsgbhi55i
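
For the two-bit independence model, the feasible set is the two-parameter surface of product distributions (p, 1-p) ⊗ (q, 1-q) inside the probability simplex on four states. A brute-force sketch of the outer minimization, using the POT package (ot.emd2) for the inner transport problem; the Hamming ground metric and the data distribution are assumptions of this illustration, and the paper instead characterizes the exact piecewise algebraic solution:

    import numpy as np
    from itertools import product
    import ot  # POT: Python Optimal Transport

    states = list(product([0, 1], repeat=2))                     # 00, 01, 10, 11
    C = np.array([[sum(a != b for a, b in zip(s, t)) for t in states] for s in states], float)

    mu = np.array([0.4, 0.1, 0.1, 0.4])                          # data distribution

    def to_variety(p, q):
        """Point of the independence model with marginals (p, 1-p) and (q, 1-q)."""
        return np.array([p * q, p * (1 - q), (1 - p) * q, (1 - p) * (1 - q)])

    # Coarse grid search over the two-parameter surface; this only locates the
    # minimizer numerically rather than describing it algebraically.
    grid = np.linspace(0.0, 1.0, 101)
    best = min(((ot.emd2(mu, to_variety(p, q), C), p, q) for p in grid for q in grid))
    print(best)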

Wasserstein Training of Restricted Boltzmann Machines

Grégoire Montavon, Klaus-Robert Müller, Marco Cuturi
2016 Neural Information Processing Systems  
We derive a gradient of that distance with respect to the model parameters. Minimization of this new objective leads to generative models with different statistical properties.  ...  This metric between observations can then be used to define the Wasserstein distance between the distribution induced by the Boltzmann machine on the one hand, and that given by the training sample on  ... 
dblp:conf/nips/MontavonMC16 fatcat:xjb3lsmp3jfz5hid4iq2bf6pja
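
A generic numpy sketch of the ingredient described above: the entropically smoothed (Sinkhorn) transport cost between a model distribution and an empirical distribution on a finite set, together with the dual potential that, up to an additive constant, serves as the gradient of that cost with respect to the model probabilities. This illustrates the general recipe, not the RBM-specific derivation in the paper.

    import numpy as np

    def sinkhorn(a, b, C, eps=0.1, iters=500):
        """Entropic OT: returns the smoothed transport cost <P, C> and the dual
        potential alpha = eps * log(u); its centered version acts as d(cost)/d(a)."""
        K = np.exp(-C / eps)
        u = np.ones_like(a)
        for _ in range(iters):
            v = b / (K.T @ u)
            u = a / (K @ v)
        P = u[:, None] * K * v[None, :]           # regularized optimal transport plan
        alpha = eps * np.log(u)
        return np.sum(P * C), alpha - alpha.mean()

    # Toy example on 3 states: the potential indicates how the cost changes if the model
    # shifts probability mass between states (the quantity backpropagated to parameters).
    C = np.array([[0.0, 1.0, 2.0], [1.0, 0.0, 1.0], [2.0, 1.0, 0.0]])
    model = np.array([0.2, 0.3, 0.5])
    data = np.array([0.5, 0.3, 0.2])
    cost, grad_wrt_model = sinkhorn(model, data, C)
    print(cost, grad_wrt_model)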

On a prior based on the Wasserstein information matrix [article]

W. Li, F. J. Rubio
2022 arXiv   pre-print
We present sufficient conditions for the propriety of the posterior distribution for general classes of models.  ...  We introduce a prior for the parameters of univariate continuous distributions, based on the Wasserstein information matrix, which is invariant under reparameterisations.  ...  The last expression is proportional to the marginal likelihood associated to a normal sampling model together with the Wasserstein prior.  ... 
arXiv:2202.03217v3 fatcat:rv4b3krc3jgc3hdsjwa6ugqnau
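
For univariate models the Wasserstein information matrix can be written as W(θ)_{ij} = ∫ ∂_{θ_i}F(x;θ) ∂_{θ_j}F(x;θ) / f(x;θ) dx, where F and f are the model CDF and density. A numerical sketch for the Gaussian location-scale model, where this matrix works out to the identity; using sqrt(det W(θ)) in a Jeffreys-style prior is only indicated here as one natural construction, and the paper's exact prior and propriety conditions are not reproduced:

    import numpy as np
    from scipy.stats import norm
    from scipy.integrate import quad

    def wasserstein_info_matrix(mu, sigma, h=1e-5):
        """Evaluate W_ij = int dF/dtheta_i * dF/dtheta_j / f dx for N(mu, sigma^2)
        via central finite differences of the CDF and adaptive quadrature."""
        def dF(x, i):
            d = np.array([0.0, 0.0]); d[i] = h
            hi = norm.cdf(x, mu + d[0], sigma + d[1])
            lo = norm.cdf(x, mu - d[0], sigma - d[1])
            return (hi - lo) / (2 * h)
        W = np.zeros((2, 2))
        lo_x, hi_x = mu - 10 * sigma, mu + 10 * sigma
        for i in range(2):
            for j in range(2):
                integrand = lambda x: dF(x, i) * dF(x, j) / norm.pdf(x, mu, sigma)
                W[i, j] = quad(integrand, lo_x, hi_x)[0]
        return W

    print(np.round(wasserstein_info_matrix(0.0, 1.0), 4))   # approximately the identity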

Improved Image Wasserstein Attacks and Defenses [article]

J. Edward Hu, Adith Swaminathan, Hadi Salman, Greg Yang
2020 arXiv   pre-print
A recently proposed Wasserstein distance-bounded threat model is a promising alternative that limits the perturbation to pixel mass movements.  ...  Perturbations in the real-world, however, rarely exhibit the pixel independence that ℓ_p threat models assume.  ...  One reason behind this is the ℓ_p threat model perturbs each pixel independently, while the Wasserstein threat model does not.  ... 
arXiv:2004.12478v1 fatcat:gvh3phuqovd5zi7jphbfy7taae
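
A sketch of what "pixel mass movement" means in this threat model: two grayscale images, normalized to unit mass, are compared as distributions over the pixel grid with the Euclidean coordinate distance as ground cost. This uses POT's exact solver on a tiny image and is only an illustration; the paper's attack and defense machinery (projected Sinkhorn iterations on full-size images) is more involved.

    import numpy as np
    import ot

    def image_wasserstein(img_a, img_b):
        """W1 between two non-negative images viewed as distributions over the pixel grid,
        with Euclidean distance between pixel coordinates as the ground cost."""
        h, w = img_a.shape
        coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
        C = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        a = (img_a / img_a.sum()).ravel()
        b = (img_b / img_b.sum()).ravel()
        return ot.emd2(a, b, C)

    # A perturbation that shifts intensity to neighbouring pixels has a small Wasserstein
    # size even when its per-pixel (l_p) norm is large.
    rng = np.random.default_rng(0)
    clean = rng.random((8, 8))
    shifted = np.roll(clean, 1, axis=1)
    print(image_wasserstein(clean, shifted))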

Reproducibility of radiomic features using network analysis and its application in Wasserstein k-means clustering

Jung Hun Oh, Aditya P. Apte, Evangelia Katsoulakis, Nadeem Riaz, Vaios Hatzoglou, Yao Yu, Usman Mahmood, Harini Veeraraghavan, Maryam Pouryahya, Aditi Iyer, Amita Shukla-Dave, Allen Tannenbaum (+2 others)
2021 Journal of Medical Imaging  
The reliability and reproducibility of those radiomic features were further validated on phantom data using the Wasserstein distance.  ...  an independent dataset.  ...  The use of unstable features in predictive modeling can lead to failure in validating the models on independent data.  ... 
doi:10.1117/1.jmi.8.3.031904 pmid:33954225 pmcid:PMC8085581 fatcat:wjg3br6zhffoblf6ixbdffpcue
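
A compact sketch of Wasserstein k-means for one-dimensional samples such as per-patient histograms of a radiomic feature; in 1-D the W2 distance and barycenter reduce to operations on quantile functions. The data, grid size, and initialization below are assumptions of this illustration, not the study's pipeline.

    import numpy as np

    def quantile_profile(samples, n_q=50):
        """Represent a 1-D sample by its quantile function on a fixed grid; Euclidean
        distance between profiles is proportional to the discretized W2 distance."""
        return np.quantile(samples, np.linspace(0.0, 1.0, n_q))

    def wasserstein_kmeans(profiles, k=2, iters=50, seed=0):
        rng = np.random.default_rng(seed)
        centers = profiles[rng.choice(len(profiles), k, replace=False)]
        for _ in range(iters):
            d = np.linalg.norm(profiles[:, None, :] - centers[None, :, :], axis=-1)
            labels = d.argmin(axis=1)
            # The W2 barycenter of 1-D distributions averages their quantile functions.
            centers = np.stack([profiles[labels == c].mean(axis=0) if np.any(labels == c)
                                else centers[c] for c in range(k)])
        return labels, centers

    rng = np.random.default_rng(1)
    data = [rng.normal(0, 1, 200) for _ in range(20)] + [rng.normal(3, 2, 200) for _ in range(20)]
    profiles = np.stack([quantile_profile(s) for s in data])
    labels, _ = wasserstein_kmeans(profiles, k=2)
    print(labels)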

OUP accepted manuscript

2019 Information and Inference A Journal of the IMA  
Our results are motivated by recent applications of minimum Wasserstein estimators to complex generative models.  ...  Statistical inference can be performed by minimizing, over the parameter space, the Wasserstein distance between model distributions and the empirical distribution of the data.  ...  If ζ goes to zero, the dual-Sinkhorn divergence goes to the Wasserstein distance. If ζ goes to infinity, it converges to the energy distance (Ramdas et al., 2017) .  ... 
doi:10.1093/imaiai/iaz003 fatcat:s6umcdcc6rfzbjrrd2borgpdpm
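
A toy sketch of minimum Wasserstein estimation in one dimension (my illustration of the general idea in the snippet, not this article's setting): fit the location of a Gaussian model by minimizing the exact 1-D Wasserstein distance between the data and model samples generated with fixed common random numbers, so the objective is well behaved in the parameter.

    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(0)
    data = rng.normal(loc=2.0, scale=1.0, size=500)      # observed sample
    noise = rng.normal(size=2000)                        # fixed noise (common random numbers)

    def objective(theta):
        # Model N(theta, 1) simulated as theta + noise; 1-D W1 computed exactly from samples.
        return wasserstein_distance(data, theta + noise)

    result = minimize_scalar(objective, bounds=(-10.0, 10.0), method="bounded")
    print(result.x)   # close to the true location 2.0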

Wasserstein Neural Processes [article]

Andrew Carr and Jared Nielsen and David Wingate
2020 arXiv   pre-print
Neural Processes (NPs) are a class of models that learn a mapping from a context set of input-output pairs to a distribution over functions.  ...  We also show that this drawback is solved by using approximations of Wasserstein distance which calculate optimal transport distances even for distributions of disjoint support.  ...  Since the Wasserstein distance is defined independent of likelihood, the WNP model still finds the proper fit of parameters given the data as seen in (fig 2).  ... 
arXiv:1910.00668v2 fatcat:63pnwx3fsnd6teaoktl6yk3o4e
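
A tiny numeric illustration of the disjoint-support point (not the paper's experiment): for two candidate uniform models whose supports both miss the data, the log-likelihood is -inf for both and gives no signal, while the Wasserstein distance remains finite and still ranks the closer model first.

    import numpy as np
    from scipy.stats import uniform, wasserstein_distance

    data = np.array([5.0, 5.1, 4.9, 5.2])                 # observations

    # Two candidate models Uniform(theta, theta + 1), both disjoint from the data.
    for theta in (0.0, 3.9):
        loglik = uniform.logpdf(data, loc=theta, scale=1.0).sum()      # -inf in both cases
        model_samples = np.random.default_rng(0).uniform(theta, theta + 1, size=1000)
        w1 = wasserstein_distance(data, model_samples)                 # finite; smaller for theta = 3.9
        print(theta, loglik, w1)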

Reproducibility test of radiomics using network analysis and Wasserstein K-means algorithm [article]

Jung Hun Oh, Aditya Apte, Evangelia Katsoulakis, Nadeem Riaz, Vaios Hatzoglou, Yao Yu, Jonathan Leeman, Usman Mahmood, Maryam Pouryahya, Aditi Iyer, Amita Shukla-Dave, Allen Tannenbaum (+2 others)
2019 bioRxiv   pre-print
Purpose: To construct robust and validated radiomic predictive models, the development of a reliable method that can identify reproducible radiomic features robust to varying image acquisition methods  ...  For phantom data, the Wasserstein distance on a largest common network component from the lung cancer data was much smaller than the Wasserstein distance on the same network using random radiomic features  ...  This has led many radiomic models built using a dataset to be unsuccessful in subsequent external validation on independent data [6] .  ... 
doi:10.1101/773168 fatcat:nnfsjrbjozgnzgdw5wjica4nt4
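
A sketch of the kind of per-feature comparison implied above, with hypothetical feature matrices standing in for two acquisitions: the 1-D Wasserstein distance between the distributions of each radiomic feature is computed, and features with small distance are flagged as candidates for reproducibility (the threshold is illustrative only).

    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(0)
    # Hypothetical radiomic feature matrices from two acquisitions (rows = patients).
    acq1 = rng.normal(size=(100, 6))
    acq2 = acq1 + rng.normal(scale=[0.05, 0.05, 0.05, 1.0, 1.0, 1.0], size=(100, 6))

    distances = np.array([wasserstein_distance(acq1[:, j], acq2[:, j])
                          for j in range(acq1.shape[1])])
    reproducible = np.where(distances < 0.2)[0]           # illustrative threshold
    print(distances.round(3), reproducible)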

Image Hashing by Minimizing Discrete Component-wise Wasserstein Distance [article]

Khoa D. Doan and Saurav Manchanda and Sarkhan Badirli and Chandan K. Reddy
2020 arXiv   pre-print
Specifically, by exploiting the desired properties of the hash function in the low-dimensional, discrete space, our method efficiently estimates a better variant of Wasserstein distance by averaging a set of easy-to-compute one-dimensional Wasserstein distances.  ...  Secondly, the OT-estimate of the Wasserstein distance requires an exponential number of samples to generalize (or to achieve a good estimate of the distance) [2].  ... 
arXiv:2003.00134v3 fatcat:x3rwu23p3ndy3hzth37p27o4di
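
A sketch of the "average of easy-to-compute one-dimensional Wasserstein distances" idea, with a hypothetical target code distribution (balanced ±1 bits); this paraphrases the estimator described in the snippet rather than reproducing the paper's training objective.

    import numpy as np
    from scipy.stats import wasserstein_distance

    def componentwise_w1(codes, target_samples):
        """Average of per-dimension 1-D Wasserstein distances between a batch of hash codes
        and samples from the desired code distribution (cheap, unlike full high-dim OT)."""
        return np.mean([wasserstein_distance(codes[:, j], target_samples[:, j])
                        for j in range(codes.shape[1])])

    rng = np.random.default_rng(0)
    codes = np.tanh(rng.normal(size=(256, 16)))              # relaxed codes in (-1, 1)
    target = rng.choice([-1.0, 1.0], size=(256, 16))         # ideal balanced binary codes
    print(componentwise_w1(codes, target))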

Permutation invariant networks to learn Wasserstein metrics [article]

Arijit Sehanobish, Neal Ravindra, David van Dijk
2021 arXiv   pre-print
In this work, we use a permutation invariant network to map samples from probability measures into a low-dimensional space such that the Euclidean distance between the encoded samples reflects the Wasserstein  ...  We show that our network can generalize to correctly compute distances between unseen densities.  ...  Acknowledgements The first author wants to thank Alexander Cloninger for helpful suggestions and for suggesting to study the geometry of the Wasserstein space by simple translations and scalings.  ... 
arXiv:2010.05820v4 fatcat:eald6qab2rg37ld5nm6gmr2pn4
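
A condensed sketch of the training idea: a DeepSets-style encoder (pointwise MLP followed by mean pooling) is permutation invariant, and it is trained so that the Euclidean distance between the embeddings of two sample sets regresses onto their 1-D Wasserstein distance. The architecture, data, and targets below are placeholders, not the authors' configuration.

    import torch
    import torch.nn as nn
    from scipy.stats import wasserstein_distance

    class SetEncoder(nn.Module):
        """Permutation-invariant encoder: a pointwise MLP followed by mean pooling."""
        def __init__(self, dim=1, hidden=64, out=8):
            super().__init__()
            self.phi = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, hidden))
            self.rho = nn.Sequential(nn.ReLU(), nn.Linear(hidden, out))
        def forward(self, x):                       # x: (batch, set_size, dim)
            return self.rho(self.phi(x).mean(dim=1))

    enc = SetEncoder()
    opt = torch.optim.Adam(enc.parameters(), lr=1e-3)
    for step in range(500):
        # Random pairs of 1-D Gaussian sample sets with empirical W1 distances as targets.
        a = torch.randn(32, 100, 1) * torch.rand(32, 1, 1) + 2 * torch.randn(32, 1, 1)
        b = torch.randn(32, 100, 1) * torch.rand(32, 1, 1) + 2 * torch.randn(32, 1, 1)
        target = torch.tensor([wasserstein_distance(a[i, :, 0].numpy(), b[i, :, 0].numpy())
                               for i in range(32)], dtype=torch.float32)
        pred = (enc(a) - enc(b)).norm(dim=1)        # Euclidean distance between embeddings
        loss = nn.functional.mse_loss(pred, target)
        opt.zero_grad(); loss.backward(); opt.step()
    print(loss.item())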
Showing results 1 — 15 out of 12,402 results