
Using probabilistic programs as proposals [article]

Marco F. Cusumano-Towner, Vikash K. Mansinghka
2018 arXiv   pre-print
By convergence of standard self-normalized importance sampling, $\hat{\mu}_N \xrightarrow{a.s.} E_{(z,\tau_{1:K}) \sim \pi(\cdot)}[f(z)] = E_{z \sim \pi(\cdot)}[f(z)] = \mu$.  ...  If $\pi(z)$ is a target distribution, $P$ is a proposal program such that $\pi(z) > 0 \Rightarrow p(z; x) > 0$ for all $x$, and $f$ is a function where $E_{z \sim \pi}[f(z)] = \mu < \infty$, then $\hat{\mu}_N \xrightarrow{a.s.} \mu$, where $\hat{\mu}_N$ is the importance sampling  ... 
arXiv:1801.03612v2 fatcat:i6tjdve2brdpjjgxcy2365leky
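The convergence result in the snippet above can be illustrated with a minimal self-normalized importance sampling sketch. This is not the paper's implementation; the target, proposal, and all parameter values below are assumptions chosen for illustration.

```python
import math
import random

random.seed(1)

def snis_estimate(num_particles, log_gamma, sample_q, log_q, f):
    """Self-normalized importance sampling estimate of E_pi[f(z)],
    where log_gamma is the log of an unnormalized target density
    (normalizing constants cancel under self-normalization)."""
    zs = [sample_q() for _ in range(num_particles)]
    log_ws = [log_gamma(z) - log_q(z) for z in zs]
    m = max(log_ws)
    ws = [math.exp(lw - m) for lw in log_ws]  # stabilized weights
    return sum(w * f(z) for w, z in zip(ws, zs)) / sum(ws)

# Illustration: target N(1, 1) known only up to a constant; proposal N(0, 2).
log_gamma = lambda z: -0.5 * (z - 1.0) ** 2
sample_q = lambda: random.gauss(0.0, 2.0)
log_q = lambda z: -0.5 * (z / 2.0) ** 2  # constant terms cancel
est = snis_estimate(100_000, log_gamma, sample_q, log_q, lambda z: z)
# est converges almost surely to E[z] = 1 as num_particles grows
```

Note that the proposal's support covers the target's, matching the condition $\pi(z) > 0 \Rightarrow p(z; x) > 0$ from the snippet.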

Quantifying the probable approximation error of probabilistic inference programs [article]

Marco F Cusumano-Towner, Vikash K Mansinghka
2016 arXiv   pre-print
arange(num_particles)); likelihoods = map((model_trace) -> likelihood(model_trace, x), model_traces); k = categorical(normalize(likelihoods)); y = (model_traces, k); z = model_traces[k]; return (y, z)  ...  f = (x_coord) -> { a * x_coord + b };  ...  observation_model: map((i) -> [|normal(f(x_coordinates[$i]), 0.3)|], arange(11))  ...  (d) Model program (model expressions bold). Figure 1: Estimating subjective  ... 
arXiv:1606.00068v1 fatcat:w6x2t62u2najhda3qwiozta4om

Encapsulating models and approximate inference programs in probabilistic modules [article]

Marco F. Cusumano-Towner, Vikash K. Mansinghka
2017 arXiv   pre-print
This paper introduces the probabilistic module interface, which allows encapsulation of complex probabilistic models with latent variables alongside custom stochastic approximate inference machinery, and provides a platform-agnostic abstraction barrier separating the model internals from the host probabilistic inference system. The interface can be seen as a stochastic generalization of a standard simulation and density interface for probabilistic primitives. We show that sound approximate inference algorithms can be constructed for networks of probabilistic modules, and we demonstrate that the interface can be implemented using learned stochastic inference networks and MCMC and SMC approximate inference programs.
arXiv:1612.04759v2 fatcat:a6q5x5qchjg7lpcr7mtsnccks4

Probabilistic programs for inferring the goals of autonomous agents [article]

Marco F. Cusumano-Towner, Alexey Radul, David Wingate, Vikash K. Mansinghka
2017 arXiv   pre-print
Tree and car 3D models in figures are from http://www.f-lohmueller.de/ [Lohmüller, 2016].  ...  The Metropolis-Hastings acceptance ratio is: $\alpha = \frac{\prod_{j \in H \cup A} p_{t_j}(z'_j; f_j(z'_{\pi_G(j)}))}{\prod_{j \in H \cup A} p_{t_j}(z_j; f_j(z_{\pi_G(j)}))} \cdot \frac{m(z_i; z')\, \prod_{j \in H} p_{t_j}(z_j; f_j(z_{\pi_G(j)}))}{m(z'_i; z)\, \prod_{j \in H} p_{t_j}(z'_j; f_j(z'_{\pi_G(j)}))} = \frac{m(z_i; z')\, \prod_{j \in A} p_{t_j}(z'_j; f_j(z'_{\pi_G(j)}))}{m(z'_i; z)\, \prod_{j \in A} p_{t_j}(z_j; f_j(z_{\pi_G(j)}))}$ (3), where the factors over $j \in H$ cancel.  ...  We illustrate Cascading Resimulation MH in Figure 2, on the task of inferring  ... 
arXiv:1704.04977v2 fatcat:4iholbzhobchja2lad55fhojvu
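The acceptance ratio in the snippet above specializes the standard Metropolis-Hastings rule. A minimal random-walk MH sketch, generic rather than the paper's Cascading Resimulation variant; the target density and step size are illustrative assumptions.

```python
import math
import random

random.seed(0)

def metropolis_hastings(log_p, z0, step_std, num_steps):
    """Random-walk Metropolis-Hastings. With a symmetric proposal the
    m(z; z') terms in the acceptance ratio cancel, leaving the density
    ratio alpha = p(z') / p(z)."""
    z = z0
    samples = []
    for _ in range(num_steps):
        z_prop = random.gauss(z, step_std)
        if math.log(random.random()) < log_p(z_prop) - log_p(z):
            z = z_prop  # accept the proposed move
        samples.append(z)
    return samples

# Illustration: target is a standard normal, known only up to a constant.
samples = metropolis_hastings(lambda z: -0.5 * z * z, 0.0, 1.0, 20_000)
mean = sum(samples) / len(samples)
```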

AIDE: An algorithm for measuring the accuracy of probabilistic inference algorithms [article]

Marco F. Cusumano-Towner, Vikash K. Mansinghka
2017 arXiv   pre-print
Approximate probabilistic inference algorithms are central to many fields. Examples include sequential Monte Carlo inference in robotics, variational inference in machine learning, and Markov chain Monte Carlo inference in statistics. A key problem faced by practitioners is measuring the accuracy of an approximate inference algorithm on a specific data set. This paper introduces the auxiliary inference divergence estimator (AIDE), an algorithm for measuring the accuracy of approximate inference algorithms. AIDE is based on the observation that inference algorithms can be treated as probabilistic models and the random variables used within the inference algorithm can be viewed as auxiliary variables. This view leads to a new estimator for the symmetric KL divergence between the approximating distributions of two inference algorithms. The paper illustrates application of AIDE to algorithms for inference in regression, hidden Markov, and Dirichlet process mixture models. The experiments show that AIDE captures the qualitative behavior of a broad class of inference algorithms and can detect failure modes of inference algorithms that are missed by standard heuristics.
arXiv:1705.07224v2 fatcat:3otnhj6qajbrhpdofztz7jdgui

Bringing clothing into desired configurations with limited perception

Marco Cusumano-Towner, Arjun Singh, Stephen Miller, James F. O'Brien, Pieter Abbeel
2011 2011 IEEE International Conference on Robotics and Automation  
$P(c_{a,g_t} \mid a, g_t) = \begin{cases} \frac{1}{f + \frac{1}{d}} & \text{if } c_{a,g_t} < f \\ \frac{1}{f + \frac{1}{d}} e^{-d (c_{a,g_t} - f)} & \text{if } c_{a,g_t} \ge f \end{cases}$ This distribution, shown in V.  ... 
doi:10.1109/icra.2011.5980327 dblp:conf/icra/Cusumano-TownerSMOA11 fatcat:imll7sr4f5hnhfr7lfji6r2sau
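The piecewise likelihood in the snippet, uniform below a cutoff $f$ and exponentially decaying above it, can be sketched as follows. The parameter values `f_cut` and `d_rate` are hypothetical, and the exponential tail is anchored at the cutoff so the density normalizes, which is our reading of the garbled transcription rather than a detail confirmed by the source.

```python
import math

# Hypothetical parameters: uniform cutoff f and decay rate d.
f_cut = 2.0
d_rate = 1.5

def likelihood(c):
    """Piecewise density: constant k below the cutoff, exponential tail
    above it. Normalization: k*f + k/d = 1 with k = 1 / (f + 1/d)."""
    k = 1.0 / (f_cut + 1.0 / d_rate)
    if c < f_cut:
        return k
    return k * math.exp(-d_rate * (c - f_cut))

# Numerical check that the density integrates to ~1 over [0, 20).
step = 0.001
total = sum(likelihood(i * step) * step for i in range(int(20.0 / step)))
```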

Measuring the non-asymptotic convergence of sequential Monte Carlo samplers using probabilistic programming [article]

Marco F. Cusumano-Towner, Vikash K. Mansinghka
2017 arXiv   pre-print
A key limitation of sampling algorithms for approximate inference is that it is difficult to quantify their approximation error. Widely used sampling schemes, such as sequential importance sampling with resampling and Metropolis-Hastings, produce output samples drawn from a distribution that may be far from the target posterior distribution. This paper shows how to upper-bound the symmetric KL divergence between the output distribution of a broad class of sequential Monte Carlo (SMC) samplers and their target posterior distributions, subject to assumptions about the accuracy of a separate gold-standard sampler. The proposed method applies to samplers that combine multiple particles, multinomial resampling, and rejuvenation kernels. The experiments show the technique being used to estimate bounds on the divergence of SMC samplers for posterior inference in a Bayesian linear regression model and a Dirichlet process mixture model.
arXiv:1612.02161v2 fatcat:s2xz22zlunczrc35fvrn64e5pq

Gen: a general-purpose probabilistic programming system with programmable inference

Marco F. Cusumano-Towner, Feras A. Saad, Alexander K. Lew, Vikash K. Mansinghka
2019 Proceedings of the 40th ACM SIGPLAN Conference on Programming Language Design and Implementation - PLDI 2019  
We define: a generative function $G$ is a tuple $G = (X, Y, f, p, q)$. The output function $f : X \times \{s : p(s; x) > 0\} \to Y$ maps an argument $x$ and an assignment $s$ of all random choices (whose probability under $p$ is non-zero) to a return value $f(x, s) \in Y$.  ... 
doi:10.1145/3314221.3314642 dblp:conf/pldi/Cusumano-Towner19 fatcat:bmjwmh7jhjf33gg6dywrdjop7y
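A minimal sketch of such a simulate/density interface, rendered in Python rather than Gen's actual Julia API; all names, signatures, and the example model below are illustrative assumptions, not Gen's implementation.

```python
import math
import random

random.seed(2)

class GenerativeFunction:
    """Minimal sketch of the tuple G = (X, Y, f, p, q): a sampler for the
    internal random choices p, a log-density for p, and an output map f."""
    def __init__(self, sample_p, logpdf_p, output_f):
        self.sample_p = sample_p   # x -> assignment s of random choices
        self.logpdf_p = logpdf_p   # (s, x) -> log p(s; x)
        self.output_f = output_f   # (x, s) -> return value in Y

    def simulate(self, x):
        """Sample s ~ p(.; x) and compute the return value f(x, s)."""
        s = self.sample_p(x)
        return s, self.output_f(x, s)

# Example: one random choice "slope"; the output is slope * x.
G = GenerativeFunction(
    sample_p=lambda x: {"slope": random.gauss(0.0, 1.0)},
    logpdf_p=lambda s, x: -0.5 * s["slope"] ** 2 - 0.5 * math.log(2 * math.pi),
    output_f=lambda x, s: s["slope"] * x,
)
s, y = G.simulate(3.0)
```

The point of the tuple view is that any inference algorithm can interact with the model purely through sampling and density evaluation, without inspecting its internals.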

Trace types and denotational semantics for sound programmable inference in probabilistic languages

Alexander K. Lew, Marco F. Cusumano-Towner, Benjamin Sherman, Michael Carbin, Vikash K. Mansinghka
2019 Proceedings of the ACM on Programming Languages (PACMPL)  
Another approach would be to develop dynamic analyses of the KL divergence between an inference program and its target distribution, based on recently introduced Monte Carlo estimators [Cusumano-Towner  ...  , and the addresses modified by MCMC kernels, could also be used to improve the performance of inference, by compiling trace data structures specialized for a particular inference algorithm's needs [Cusumano-Towner  ... 
doi:10.1145/3371087 fatcat:kf4vfvzt7rdvzencdtksbk3moq

Bayesian synthesis of probabilistic programs for automatic data modeling

Feras A. Saad, Marco F. Cusumano-Towner, Ulrich Schaechtle, Martin C. Rinard, Vikash K. Mansinghka
2019 Proceedings of the ACM on Programming Languages (PACMPL)  
$\{f(x) : x \in X\}$.  ...  Given time points $\{x_1, \ldots, x_n\}$, the vector of random variables $[f(x_1), \ldots, f(x_n)]$ is jointly Gaussian with mean vector $[m(x_1), \ldots, m(x_n)]$ and covariance matrix $[k(x_i, x_j)]$  ... 
doi:10.1145/3290350 fatcat:xemazron3rg65nvmab2rdcgyei
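The jointly Gaussian construction described in the snippet can be sketched directly: build the mean vector and covariance matrix from a mean function and a kernel. The squared-exponential kernel and zero mean function below are illustrative assumptions, not necessarily the paper's choices.

```python
import math

def squared_exp_kernel(xi, xj, lengthscale=1.0):
    """Squared-exponential covariance k(x_i, x_j)."""
    return math.exp(-0.5 * ((xi - xj) / lengthscale) ** 2)

def gp_moments(xs, mean_fn, kernel):
    """Mean vector [m(x_1), ..., m(x_n)] and covariance matrix
    [k(x_i, x_j)] of the jointly Gaussian vector [f(x_1), ..., f(x_n)]."""
    m = [mean_fn(x) for x in xs]
    K = [[kernel(xi, xj) for xj in xs] for xi in xs]
    return m, K

xs = [0.0, 0.5, 1.0]
m, K = gp_moments(xs, lambda x: 0.0, squared_exp_kernel)
# K is symmetric with unit diagonal (each f(x_i) has unit prior variance)
```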

Towards Denotational Semantics of AD for Higher-Order, Recursive, Probabilistic Languages [article]

Alexander K. Lew, Mathieu Huot, Vikash K. Mansinghka
2021 arXiv   pre-print
Marco F Cusumano-Towner, Feras A Saad, Alexander K Lew, and Vikash K Mansinghka. Gen: a general-purpose probabilistic programming system with programmable inference.  ...  Marco Cusumano-Towner, Alexander K Lew, and Vikash K Mansinghka. Automating involutive mcmc using probabilistic and differentiable programming. arXiv preprint arXiv:2007.09871, 2020.  ... 
arXiv:2111.15456v2 fatcat:3nhzpfr5u5cstonw7slwrz7co4

Sampling Prediction-Matching Examples in Neural Networks: A Probabilistic Programming Approach [article]

Serena Booth, Ankit Shah, Yilun Zhou, Julie Shah
2020 arXiv   pre-print
Acknowledgements We thank Alex Lew, Marco Cusumano-Towner, Christian Muise, and Hendrik Strobelt for fruitful discussions.  ...  In our work, we use the state-of-the-art language Gen (Cusumano-Towner et al., 2019).  ...  Our technique empirically evaluates prediction level sets, or sets of a given confidence, through applying probabilistic programming (Cusumano-Towner et al., 2019) .  ... 
arXiv:2001.03076v1 fatcat:dfa6ibvkefcxtem3xjcgysaany

Learning Proposals for Probabilistic Programs with Inference Combinators [article]

Sam Stites, Heiko Zimmermann, Hao Wu, Eli Sennesh, Jan-Willem van de Meent
2021 arXiv   pre-print
Cusumano-Towner, Marco F., Saad, Feras A., Lew, Alexander K., Mansinghka, Vikash K. "Gen: A General-Purpose Probabilistic Programming System with Programmable Inference."  ...  et al., 2018), Gen (Cusumano-Towner et al., 2019), Pyro, and Edward2 (Tran et al., 2016).  ...  In the case where the set of random variables in the proposal and target program is the same, i.e. $\mathrm{dom}(\tau_2) = \mathrm{dom}(\tau_1)$ and $q_1 = f$ is a primitive program, we can write $w_1 = \gamma_f(\tau_1; c_0, \phi) / p_f($  ... 
arXiv:2103.00668v3 fatcat:wc5yn2njabbzpeayykqk55cymu

Bayes-TrEx: a Bayesian Sampling Approach to Model Transparency by Example

Serena Booth, Yilun Zhou, Ankit Shah, Julie Shah
2021 Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence and the Thirty-Third Innovative Applications of Artificial Intelligence Conference  
Acknowledgements The authors would like to thank: Alex Lew, Marco Cusumano-Towner, and Tan Zhi-Xuan for their insights on how to formulate this inference problem and use probabilistic programming effectively  ...  State-of-the-art approaches require a gold-standard inference algorithm (Cusumano-Towner and Mansinghka 2017) or specific posterior distribution properties, such as log-concavity (Gorham and Mackey 2015).  ...  We have $u_1 \mid x \sim \mathcal{N}(|f(x)_i - f(x)_j|, \sigma_1^2)$ (7), $u_2 \mid x \sim \mathcal{N}(\min(f(x)_i, f(x)_j) - \max_{k \neq i,j} f(x)_k, \sigma_2^2)$ (8), and $u_1^* = 0,\ u_2^* = 0.5$ (9). $\sigma_1$ and $\sigma_2$ are hyperparameters.  ... 
doi:10.1609/aaai.v35i13.17361 fatcat:dh6pe76ob5dzrcfswhlu6ojskq

Causal Inference using Gaussian Processes with Structured Latent Confounders [article]

Sam Witty, Kenta Takatsu, David Jensen, Vikash Mansinghka
2020 arXiv   pre-print
Acknowledgments Thanks to Marco Cusumano-Towner, Feras Saad, Alex Lew, Cameron Freer, Rachel Paiste, Amanda Gentzel, Andy Zane, Jameson Quinn, and the anonymous reviewers for their helpful feedback and  ...  We implement the GP-SLC model using Gen (Cusumano-Towner et al., 2019), a probabilistic programming language with programmable inference.  ...  We implement both models in Gen (Cusumano-Towner et al., 2019) . For both models, we use α σy = 4.0, β σy = 4.0, µ (·) = 0, σ 2 α = 3.0, σ 2 β = 1.0, and σ 2 η = 10.0 as priors.  ... 
arXiv:2007.07127v1 fatcat:ozv4qpjgbzairaf6abidimuzum