
Expressive power of tensor-network factorizations for probabilistic modeling, with applications from hidden Markov models to quantum machine learning [article]

Ivan Glasser, Ryan Sweke, Nicola Pancotti, Jens Eisert, J. Ignacio Cirac
2019 arXiv   pre-print
Inspired by these developments, and the natural correspondence between tensor networks and probabilistic graphical models, we provide a rigorous analysis of the expressive power of various tensor-network  ...  Additionally, we introduce locally purified states (LPS), a new factorization inspired by techniques for the simulation of quantum systems, with provably better expressive power than all other representations  ...  Acknowledgments We would like to thank Vedran Dunjko for his comments on the manuscript and João Gouveia for his suggestion of the proof of Lemma 9 in the supplementary material. I. G., N. P. and J.  ... 
arXiv:1907.03741v2 fatcat:cwryxzb3xnb7bf7ev7zjdrhzaa
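
For orientation, the factorizations compared in this work can be written schematically (boundary conditions and normalization suppressed; a sketch, not the paper's exact notation). A nonnegative tensor-train (matrix-product) factorization,

    P(x_1,\dots,x_N) \propto \operatorname{Tr}\left[ A_1^{x_1} A_2^{x_2} \cdots A_N^{x_N} \right], \qquad A_i^{x_i} \ge 0 \ \text{entrywise},

recovers hidden-Markov-type models; a Born machine instead squares a complex contraction,

    P(x_1,\dots,x_N) \propto \left| \operatorname{Tr}\left[ B_1^{x_1} \cdots B_N^{x_N} \right] \right|^2;

and a locally purified state attaches a purification index \mu_i to each tensor and sums the squared contraction over it, schematically P(x) \propto \sum_{\mu_1,\dots,\mu_N} \left| \operatorname{Tr}\left[ A_1^{x_1,\mu_1} \cdots A_N^{x_N,\mu_N} \right] \right|^2.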

Probabilistic Neural Programs [article]

Kenton W. Murray, Jayant Krishnamurthy
2016 arXiv   pre-print
We present probabilistic neural programs, a framework for program induction that permits flexible specification of both a computational model and inference algorithm while simultaneously enabling the use of deep neural networks.  ...  Acknowledgements The authors would like to thank the reviewers for their comments as well as helpful discussions with Arturo Argueta and Oyvind Tafjord.  ... 
arXiv:1612.00712v1 fatcat:mffogftuwzg47mqo3imqvdnnrm

Probabilistic Graphical Models and Tensor Networks: A Hybrid Framework [article]

Jacob Miller and Geoffrey Roeder and Tai-Danae Bradley
2021 arXiv   pre-print
We investigate a correspondence between two formalisms for discrete probabilistic modeling: probabilistic graphical models (PGMs) and tensor networks (TNs), a powerful modeling framework for simulating  ...  This method allows a broad family of probabilistic TN models to be encoded as partially decohered BMs, a fact we leverage to combine the representational strengths of both model families.  ...  Acknowledgments and Disclosure of Funding The authors thank Guillaume Verdon, Antonio Martinez, and Stefan Leichenauer for helpful discussions, and Jae Hyeon Yoo for engineering support.  ... 
arXiv:2106.15666v1 fatcat:zamkpurkp5hfbk2ogd6pkwtrlu
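
The PGM-TN correspondence underlying this hybrid framework is concrete enough to show in a few lines: the joint distribution of a short hidden Markov model is exactly a tensor-network contraction. A minimal sketch in Python (all sizes illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    H, V = 3, 2                              # hidden / visible alphabet sizes
    pi = rng.dirichlet(np.ones(H))           # initial hidden distribution
    T = rng.dirichlet(np.ones(H), size=H)    # transitions, rows sum to 1
    E = rng.dirichlet(np.ones(V), size=H)    # emissions, rows sum to 1

    # p(x1,x2,x3) = sum_{h1,h2,h3} pi[h1] E[h1,x1] T[h1,h2] E[h2,x2] T[h2,h3] E[h3,x3]
    joint = np.einsum('a,ax,ab,by,bc,cz->xyz', pi, E, T, E, T, E)
    print(joint.sum())                       # 1.0: the contraction is the HMM joint

A Born machine replaces the nonnegative tensors with complex ones and squares the contraction, which is where the paper's partial-decoherence construction enters.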

Tensor Decompositions in Deep Learning [article]

Davide Bacciu, Danilo P. Mandic
2020 arXiv   pre-print
The paper surveys the topic of tensor decompositions in modern machine learning applications. It focuses on three active research topics of significant relevance for the community.  ...  After a brief review of consolidated works on multi-way data analysis, we consider the use of tensor decompositions in compressing the parameter space of deep learning models.  ...  Further, they put forward a methodological framework that allows assessing the expressive power of the compressed neural models.  ... 
arXiv:2002.11835v1 fatcat:izu4qtizqbghhnaxlhisi2tnce
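
The compression theme the survey covers is easiest to see in the plain matrix case: replace a dense layer's weights with a truncated low-rank factorization, trading accuracy for parameters. A sketch (sizes and rank illustrative; Tucker and tensor-train decompositions apply the same trade to higher-order weight tensors such as convolution kernels):

    import numpy as np

    W = np.random.randn(512, 256)                # a dense layer's weight matrix
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    r = 32                                       # retained rank
    A, B = U[:, :r] * s[:r], Vt[:r, :]           # W ~= A @ B
    print(W.size, A.size + B.size)               # 131072 params vs 24576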

A quantum machine learning algorithm based on generative models

X. Gao, Z.-Y. Zhang, L.-M. Duan
2018 Science Advances  
A significant school of thought regarding artificial intelligence is based on generative models. Here, we propose a general quantum algorithm for machine learning based on a quantum generative model.  ...  We prove that our proposed model is more capable of representing probability distributions than classical generative models and has exponential speedup in learning and inference, at least for some  ...  The representational power is also closely related to the so-called generalization ability of a probabilistic model (see section S2).  ... 
doi:10.1126/sciadv.aat9004 fatcat:ad3v6ytw45cyrd6z3vj62gpsna

A semi-tensor product approach for Probabilistic Boolean Networks

Xiaoqing Cheng, Yushan Qiu, Wenpin Hou, Wai-Ki Ching
2014 2014 8th International Conference on Systems Biology (ISB)  
We discuss three important problems in Probabilistic Boolean Networks (PBNs): the dynamics of a PBN, the steady-state probability distribution, and the inverse problem.  ...  Modeling genetic regulatory networks is an important issue in systems biology. Various models and mathematical formalisms have been proposed in the literature to capture their behavior.  ...  ACKNOWLEDGMENT The authors would like to thank the referees and the editor for their helpful comments and suggestions.  ... 
doi:10.1109/isb.2014.6990737 fatcat:c72xay5i2za5xbydshr2f7qpyq
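
The semi-tensor product (STP) underlying this approach extends matrix multiplication to factors with mismatched dimensions, which lets every Boolean update rule act as a linear map on canonical state vectors; a PBN then switches randomly among such maps. A minimal sketch (the structure matrix shown is for AND; names are illustrative):

    import numpy as np

    def stp(A, B):
        """Semi-tensor product of A and B (standard Kronecker-lifted definition)."""
        n, p = A.shape[1], B.shape[0]
        t = np.lcm(n, p)
        return np.kron(A, np.eye(t // n)) @ np.kron(B, np.eye(t // p))

    # Boolean values as canonical vectors: True = [1,0]^T, False = [0,1]^T
    T_, F_ = np.array([[1], [0]]), np.array([[0], [1]])
    M_and = np.array([[1, 0, 0, 0],   # structure matrix of logical AND
                      [0, 1, 1, 1]])
    print(stp(M_and, stp(T_, F_)).ravel())   # [0. 1.], i.e. True AND False = False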

Factorized Variational Autoencoders for Modeling Audience Reactions to Movies

Zhiwei Deng, Rajitha Navarathna, Peter Carr, Stephan Mandt, Yisong Yue, Iain Matthews, Greg Mori
2017 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)  
Matrix and tensor factorization methods are often used for finding underlying low-dimensional patterns in noisy data.  ...  We apply our approach to a large dataset of facial expressions of movie-watching audiences (over 16 million faces).  ...  When F is a single matrix, it corresponds to model 1 (linear tensor factorization). When F is a multi-layer neural network, it corresponds to model 2 (nonlinear tensor factorization).  ... 
doi:10.1109/cvpr.2017.637 dblp:conf/cvpr/DengN0MYMM17 fatcat:cje34hrjwnawlk3jhhmevunfjm
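
The snippet's model-1/model-2 contrast is easy to state in code: a decoder combines per-viewer and per-timestep latent factors through F, which is a single linear map in the linear case and a multi-layer network in the nonlinear case. A sketch under assumed sizes (all names hypothetical; in the paper the factors are inferred variationally inside a VAE):

    import torch
    import torch.nn as nn

    viewers, timesteps, d, obs_dim = 100, 50, 16, 68   # illustrative sizes
    U = nn.Embedding(viewers, d)          # per-viewer latent factors
    V = nn.Embedding(timesteps, d)        # per-timestep latent factors
    F = nn.Sequential(nn.Linear(2 * d, 64), nn.ReLU(), nn.Linear(64, obs_dim))

    i, t = torch.tensor([3]), torch.tensor([7])
    x_hat = F(torch.cat([U(i), V(t)], dim=-1))   # reconstructed face features
    print(x_hat.shape)                           # torch.Size([1, 68])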

An efficient quantum algorithm for generative machine learning [article]

Xun Gao, Zhengyu Zhang, Luming Duan
2017 arXiv   pre-print
We prove that our proposed model is exponentially more powerful at representing probability distributions than classical generative models and has exponential speedup in training and inference, at least for some  ...  Here, we propose an efficient quantum algorithm for machine learning based on a quantum generative model.  ...  Acknowledgements This work was supported by the Ministry of Education and the National Key Research and Development Program of China.  ... 
arXiv:1711.02038v1 fatcat:nxcusddmbzfnrck3sizi4dkhta

From Probabilistic Graphical Models to Generalized Tensor Networks for Supervised Learning

Ivan Glasser, Nicola Pancotti, J. Ignacio Cirac
2020 IEEE Access  
In this work we explore the connection between tensor networks and probabilistic graphical models, and show that it motivates the definition of generalized tensor networks where information from a tensor  ...  We benchmark our algorithm for several generalized tensor network architectures on the task of classifying images and sounds, and show that they outperform previously introduced tensor-network algorithms  ...  ACKNOWLEDGMENT We would like to thank Shi-Ju Ran, Yoav Levine, and Miles Stoudenmire for discussions, and Vedran Dunjko for a careful reading of the paper and helpful comments.  ... 
doi:10.1109/access.2020.2986279 fatcat:b22uwdtlwjgujdgwmeue2mqrt4

From probabilistic graphical models to generalized tensor networks for supervised learning [article]

Ivan Glasser, Nicola Pancotti, J. Ignacio Cirac
2019 arXiv   pre-print
In this work we explore the connection between tensor networks and probabilistic graphical models, and show that it motivates the definition of generalized tensor networks where information from a tensor  ...  We benchmark our algorithm for several generalized tensor network architectures on the task of classifying images and sounds, and show that they outperform previously introduced tensor-network algorithms  ...  ACKNOWLEDGMENT We would like to thank Shi-Ju Ran, Yoav Levine and Miles Stoudenmire for discussions, as well as Vedran Dunjko for a careful reading of the paper and helpful comments.  ... 
arXiv:1806.05964v2 fatcat:b62plq223rappb5p6fll76o6li
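
A tensor-network classifier of the kind benchmarked in these two papers can be sketched briefly: map each pixel through a local feature map, then contract the feature vectors with an MPS whose final tensor carries the class index. A toy sketch (sizes, initialization, and the feature map are illustrative; generalized tensor networks additionally allow features to be copied and reused across subnetworks):

    import numpy as np

    rng = np.random.default_rng(0)
    N, D, C = 8, 4, 3                        # pixels, bond dimension, classes
    A_first = rng.normal(size=(2, D))
    A_mid = [rng.normal(size=(D, 2, D)) * 0.5 for _ in range(N - 2)]
    A_last = rng.normal(size=(D, 2, C))      # last tensor carries the class index

    def phi(p):                              # local feature map for pixel p in [0,1]
        return np.array([np.cos(np.pi * p / 2), np.sin(np.pi * p / 2)])

    def scores(x):                           # contract the MPS left to right
        v = phi(x[0]) @ A_first
        for i, A in enumerate(A_mid):
            v = np.einsum('d,dpe,p->e', v, A, phi(x[i + 1]))
        return np.einsum('d,dpc,p->c', v, A_last, phi(x[-1]))

    print(scores(rng.uniform(size=N)))       # unnormalized class scores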

Streaming Probabilistic Deep Tensor Factorization [article]

Shikai Fang, Zheng Wang, Zhimeng Pan, Ji Liu, Shandian Zhe
2020 arXiv   pre-print
Despite the success of existing tensor factorization methods, most of them conduct a multilinear decomposition, and rarely exploit powerful modeling frameworks, like deep neural networks, to capture a  ...  To address these issues, we propose SPIDER, a Streaming ProbabilistIc Deep tEnsoR factorization method. We first use Bayesian neural networks (NNs) to construct a deep tensor factorization model.  ...  Motivated by the expressive power of (deep) neural networks (Goodfellow et al., 2016), we propose a Bayesian deep tensor factorization model to overcome the limitations of traditional methods and flexibly  ... 
arXiv:2007.07367v1 fatcat:7cernqt3onbybaf2wn7zfxt7uu
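
SPIDER's streaming updates are variational posteriors over Bayesian-NN weights; the conjugate linear-Gaussian case below shows the core streaming pattern in its simplest form, absorbing each observation into the posterior and then discarding it (a stand-in sketch, not the paper's algorithm; all names illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    d, noise_var = 8, 0.1
    Lam, eta = np.eye(d), np.zeros(d)     # prior precision and precision-mean

    for _ in range(200):                  # observations arrive one at a time
        x = rng.normal(size=d)
        y = x.sum() + rng.normal() * noise_var ** 0.5   # true weights: all ones
        Lam += np.outer(x, x) / noise_var # absorb the observation ...
        eta += y * x / noise_var          # ... which can then be discarded

    print(np.linalg.solve(Lam, eta))      # posterior mean, approx. all ones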

Introduction to the Special Issue on Tensor Decomposition for Signal Processing and Machine Learning

Hongyang Chen, Sergiy A. Vorobyov, Hing Cheung So, Fauzia Ahmad, Fatih Porikli
2021 IEEE Journal on Selected Topics in Signal Processing  
BAM covers various probabilistic nonnegative tensor factorization (NTF) and topic models under one general framework.  ...  These tools aid in learning a variety of models, including community models, probabilistic context-free grammars, Gaussian mixture models, and two-layer neural networks.  ... 
doi:10.1109/jstsp.2021.3065184 fatcat:qbvihejwkfaa5hoztety77pnwi

Compiling Stan to Generative Probabilistic Languages and Extension to Deep Probabilistic Programming [article]

Guillaume Baudart, Javier Burroni, Martin Hirzel, Louis Mandel, Avraham Shinnar
2021 arXiv   pre-print
Stan is a probabilistic programming language that is popular in the statistics community, with a high-level syntax for expressing probabilistic models.  ...  Building on Pyro, we extend Stan with support for explicit variational inference guides and deep probabilistic models.  ...  Adding neural networks: One of the main advantages of Pyro is its tight integration with PyTorch, which allows the authoring of deep probabilistic models, that is, probabilistic models involving neural networks  ... 
arXiv:1810.00873v5 fatcat:3lcvh6vr6rbxhaszuuuvjpsjuu
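
To make the compilation target concrete: a coin-flip Stan model (theta ~ beta(1,1); x[n] ~ bernoulli(theta)) corresponds to a Pyro generative function along these lines. This is a hand-written illustration of the correspondence, not the compiler's actual output:

    import torch
    import pyro
    import pyro.distributions as dist

    def model(x):
        # Stan: theta ~ beta(1, 1);  x ~ bernoulli(theta);
        theta = pyro.sample("theta", dist.Beta(1.0, 1.0))
        with pyro.plate("data", len(x)):
            pyro.sample("x", dist.Bernoulli(theta), obs=x)

    model(torch.tensor([1.0, 0.0, 1.0, 1.0]))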

Probabilistic Multilayer Networks [article]

Enrique Hernández-Lemus, Jesús Espinal-Enríquez, Guillermo de Anda-Jáuregui
2021 arXiv   pre-print
Here we introduce probabilistic weighted and unweighted multilayer networks as derived from information-theoretic correlation measures on large multidimensional datasets.  ...  We present the fundamentals of the formal application of probabilistic inference to problems embedded in multilayered environments, providing examples taken from the analysis of biological and social systems  ...  Clique factorization (closely related to the associated Gibbs measure of the MRF) is a powerful way to factor the full joint probability distribution (JPD).  ... 
arXiv:1808.07857v2 fatcat:2eqcidofhzbahjvs4u4i6hmuoi
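
The clique factorization mentioned in the snippet is the standard Markov random field form: the joint probability distribution factors over the cliques C of the network's graph,

    P(X) = \frac{1}{Z} \prod_{C} \psi_C(X_C), \qquad Z = \sum_{X} \prod_{C} \psi_C(X_C),

or equivalently the Gibbs measure P(X) \propto \exp\left( -\sum_C E_C(X_C) \right) with clique energies E_C = -\log \psi_C.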

Functional Tensors for Probabilistic Programming [article]

Fritz Obermeyer, Eli Bingham, Martin Jankowiak, Du Phan, Jonathan P. Chen
2020 arXiv   pre-print
Moreover, functional tensors are a natural candidate for generalized variable elimination and parallel-scan filtering algorithms that enable parallel exact inference for a large family of tractable modeling  ...  We demonstrate the versatility of functional tensors by integrating them into the modeling frontend and inference backend of the Pyro programming language.  ...  Funsors fill two roles in probabilistic programming: as representations of lazy tensor expressions in user-facing model code generated by nonstandard interpretation of  ... 
arXiv:1910.10775v2 fatcat:lotryn2vtzcyzg45derogdr4b4
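
The variable-elimination operation that functional tensors generalize can be shown in its dense special case: exact marginalization of a product of discrete factors by a single contraction. A toy sketch over a chain a - b - c (not the Funsor API itself; sizes illustrative):

    import numpy as np

    f_ab = np.random.rand(3, 4)                # factor over (a, b)
    f_bc = np.random.rand(4, 5)                # factor over (b, c)
    p_c = np.einsum('ab,bc->c', f_ab, f_bc)    # eliminate a and b at once
    p_c /= p_c.sum()                           # normalized marginal over c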
Showing results 1 — 15 of 9,162 results