A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2019; you can also visit the original URL.
The file type is application/pdf.
Generalizing Hamiltonian Monte Carlo with Neural Networks
[article]
2018
arXiv
pre-print
We present a general-purpose method to train Markov chain Monte Carlo kernels, parameterized by deep neural networks, that converge and mix quickly to their target distribution. ...
Our method generalizes Hamiltonian Monte Carlo and is trained to maximize expected squared jumped distance, a proxy for mixing speed. ...
arXiv:1711.09268v3
fatcat:e3ls4iags5hshodawirtxkpslq
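The abstract above uses expected squared jumped distance (ESJD) as a trainable proxy for mixing speed. A minimal sketch of how ESJD is estimated from a chain of samples (the function name and array layout are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def expected_squared_jumped_distance(chain):
    """Estimate the ESJD of an MCMC chain: the mean of
    ||x_{t+1} - x_t||^2 over successive samples.  Larger ESJD loosely
    indicates faster mixing, which is why it serves as a training proxy.
    `chain` is a (T, d) array of T samples in d dimensions."""
    chain = np.asarray(chain, dtype=float)
    jumps = np.diff(chain, axis=0)            # x_{t+1} - x_t, shape (T-1, d)
    return float(np.mean(np.sum(jumps ** 2, axis=1)))
```

A chain that never moves has ESJD 0, while a one-dimensional chain stepping by 1 at every iteration has ESJD 1.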
Bayesian deep neural networks for low-cost neurophysiological markers of Alzheimer's disease severity
[article]
2018
arXiv
pre-print
, i.e., Monte Carlo dropout and Hamiltonian Monte Carlo. ...
Here, we utilize Bayesian neural networks to develop a multivariate predictor for AD severity using a wide range of quantitative EEG (QEEG) markers. ...
Hamiltonian Monte Carlo [7], which is a Markov Chain Monte Carlo technique that utilises Hamiltonian dynamics to explore the parameter space of networks. ...
arXiv:1812.04994v2
fatcat:g4kr766i3jhgjmm2pqqcgohh4u
Self-learning Monte Carlo method with Behler-Parrinello neural networks
[article]
2018
arXiv
pre-print
We propose a general way to construct an effective Hamiltonian in the Self-learning Monte Carlo (SLMC) method, which speeds up Monte Carlo simulations by training an effective model to propose uncorrelated ...
We construct a self-learning continuous-time interaction-expansion quantum Monte Carlo method with BPNNs and apply it to quantum impurity models. ...
Metropolis-Hastings algorithm
In the Monte Carlo method, we have to generate the configuration C with the probability distribution w(C). ...
arXiv:1807.04955v2
fatcat:sqgqxsofxnez7a24ps24qkv23a
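The snippet above refers to generating configurations C with probability w(C) via the Metropolis-Hastings algorithm. A minimal illustrative sketch with a symmetric Gaussian random-walk proposal (the function name, step size, and interface are assumptions, not the paper's implementation):

```python
import numpy as np

def metropolis_hastings(log_w, x0, n_steps, step=0.5, rng=None):
    """Sample configurations C with probability proportional to w(C).
    `log_w` returns log w(C), which avoids overflow for sharply
    peaked weights; the proposal is a symmetric Gaussian step, so the
    acceptance ratio reduces to w(C') / w(C)."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        proposal = x + step * rng.standard_normal(x.shape)
        # Accept with probability min(1, w(C') / w(C)).
        if np.log(rng.random()) < log_w(proposal) - log_w(x):
            x = proposal
        samples.append(x.copy())
    return np.array(samples)
```

SLMC replaces the expensive `log_w` of the original Hamiltonian with a cheap learned effective model, then corrects for the mismatch with a final exact acceptance step.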
Neural quantum states for supersymmetric quantum gauge theories
[article]
2021
arXiv
pre-print
We employ a neural quantum state ansatz for the wave function of a supersymmetric matrix model and use a variational quantum Monte Carlo approach to discover the ground state of the system. ...
The wave function of such supersymmetric gauge theories is not known and it is challenging to obtain with traditional techniques. ...
Methods
Variational Quantum Monte Carlo
The variational quantum Monte Carlo method consists of three components: a wave function network, a sampler network, and an optimizer. ...
arXiv:2112.05333v1
fatcat:khiu7nazqrba7j7mb6ijyyywsq
Machine learning phases of matter
2017
Nature Physics
sampled with Monte Carlo. ...
We show that this classification occurs within the neural network without knowledge of the Hamiltonian or even the general locality of interactions. ...
Instead, we construct a fully connected feed-forward neural network, implemented with TensorFlow [6], to perform supervised learning directly on the raw configurations sampled by a Monte Carlo simulation ...
doi:10.1038/nphys4035
fatcat:q7vkd73gwnepldhc4dfpqyhxoi
Accelerating lattice quantum Monte Carlo simulations using artificial neural networks: Application to the Holstein model
2019
Physical review B
Monte Carlo (MC) simulations are essential computational approaches with widespread use throughout all areas of science. ...
We find that both artificial neural networks are capable of learning an unspecified effective model that accurately reproduces the MC configuration weights of the original Hamiltonian and achieve an order ...
We have extended the use of artificial neural networks in self-learning Monte Carlo methods to lattice Monte Carlo simulations. ...
doi:10.1103/physrevb.100.020302
fatcat:tptxfibh3jfjvprtfzzqbqrasy
Sample generation for the spin-fermion model using neural networks
[article]
2022
arXiv
pre-print
The simplicity of the architecture we use in conjunction with the model agnostic form of the neural networks can enable fast sample generation without the need of a researcher's intervention. ...
Quantum Monte-Carlo simulations of hybrid quantum-classical models such as the double exchange Hamiltonian require calculating the density of states of the quantum degrees of freedom at every step. ...
Moreover, approaches to accelerate Monte-Carlo simulations with machine learning have been explored in the self-learning Monte-Carlo method [26], using restricted Boltzmann machines [27], deep [28 ...
arXiv:2206.07753v1
fatcat:7a5io2dgbvahvbr765hemsrhxu
Network-Initialized Monte Carlo Based on Generative Neural Networks
[article]
2022
arXiv
pre-print
We design generative neural networks that generate Monte Carlo configurations with complete absence of autocorrelation from which only short Markov chains are needed before making measurements for physical ...
We further propose a network-initialized Monte Carlo scheme based on such neural networks, which provides independent samplings and can accelerate the Monte Carlo simulations by significantly reducing ...
Moreover, we design a network-initialized Monte Carlo (NIMC) scheme with the assistance of such neural networks, which "heals" the initial bias due to the unthermalized ...
arXiv:2106.00712v6
fatcat:vvruhji4gfewdjmikrtpopg6uq
Stochastic Fractional Hamiltonian Monte Carlo
2018
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
The experimental results show that the proposed stochastic fractional Hamiltonian Monte Carlo for training deep neural networks could converge faster than other popular optimization schemes and generalize ...
In this paper, we propose a novel stochastic fractional Hamiltonian Monte Carlo approach which generalizes the Hamiltonian Monte Carlo method within the framework of fractional calculus and Lévy diffusion ...
name the dynamics described by Eq. (12) as Fractional Hamiltonian Monte Carlo (FHMC) due to its similarity with Hamiltonian Monte Carlo. ...
doi:10.24963/ijcai.2018/419
dblp:conf/ijcai/YeZ18
fatcat:b2s4slwhsveh7aucgplqe5ckvu
Variational Monte Carlo calculations of A ≤ 4 nuclei with an artificial neural-network correlator ansatz
[article]
2021
arXiv
pre-print
We successfully benchmark the ANN wave function against more conventional parametrizations based on two- and three-body Jastrow functions, and virtually-exact Green's function Monte Carlo results. ...
Artificial neural networks (ANNs) have proven to be a flexible tool to approximate quantum many-body states in condensed matter and chemistry problems. ...
In a series of recent works [25][26][27], deep neural networks have been further developed to tackle ab-initio chemistry problems within variational Monte Carlo, often resulting in accuracy improvements ...
arXiv:2007.14282v2
fatcat:dxnayb7oqfd37il6vjmumhl3f4
Machine learning many-electron wave functions via backflow transformations
2020
Journal Club for Condensed Matter Physics
Combined with Monte Carlo methods to evaluate the high-dimensional integral over all particle coordinates, variational and quantum Monte Carlo calculations have provided the most accurate values of the many-body ...
Can representations based on neural networks reduce this remaining bias, similar to successful ...
doi:10.36471/jccm_may_2020_01
fatcat:wrfr6xvihvhrzcdemurzxhcrly
Extending Machine Learning Classification Capabilities with Histogram Reweighting
[article]
2020
arXiv
pre-print
We propose the use of Monte Carlo histogram reweighting to extrapolate predictions of machine learning methods. ...
By interpreting the output of the neural network as an order parameter, we explore connections with known observables in the system and investigate its scaling behaviour. ...
This enables the elimination of any potential bias in the quantity associated with the finiteness of the Monte Carlo generated sample. ...
arXiv:2004.14341v1
fatcat:4qpocs6dcfgf3gmykf3gb4bz4i
Neural network gradient Hamiltonian Monte Carlo
2019
Computational statistics (Zeitschrift)
Hamiltonian Monte Carlo is a widely used algorithm for sampling from posterior distributions of complex Bayesian models. ...
We present a method to substantially reduce the computation burden by using a neural network to approximate the gradient. ...
Background
Hamiltonian Monte Carlo
Let x ∼ π(x|q) denote a probabilistic model with π a probability density function and q its corresponding parameter. ...
doi:10.1007/s00180-018-00861-z
pmid:31695242
pmcid:PMC6833949
fatcat:g4t6h7f6rva77hvsyrtjn2xik4
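The entry above substitutes a neural-network approximation for the expensive gradient inside Hamiltonian Monte Carlo. A minimal sketch of the HMC update this plugs into, where `grad_log_pi` may be the exact gradient or a cheap surrogate; function names, step sizes, and the interface are illustrative assumptions, not the paper's code:

```python
import numpy as np

def leapfrog(q, p, grad_log_pi, eps, n_leap):
    """Integrate one Hamiltonian trajectory with the leapfrog scheme.
    `grad_log_pi` can be replaced by any approximation (e.g. a trained
    neural network) without changing the rest of the algorithm."""
    q, p = q.copy(), p.copy()
    p += 0.5 * eps * grad_log_pi(q)        # half momentum kick
    for _ in range(n_leap - 1):
        q += eps * p                       # full position drift
        p += eps * grad_log_pi(q)          # full momentum kick
    q += eps * p
    p += 0.5 * eps * grad_log_pi(q)        # final half kick
    return q, -p                           # negate momentum for reversibility

def hmc_step(q, log_pi, grad_log_pi, eps=0.1, n_leap=10, rng=None):
    """Single HMC update with a Metropolis correction, so an approximate
    gradient still leaves the exact posterior invariant."""
    rng = np.random.default_rng(rng)
    p = rng.standard_normal(q.shape)
    q_new, p_new = leapfrog(q, p, grad_log_pi, eps, n_leap)
    h_old = -log_pi(q) + 0.5 * np.sum(p ** 2)
    h_new = -log_pi(q_new) + 0.5 * np.sum(p_new ** 2)
    return q_new if np.log(rng.random()) < h_old - h_new else q
```

Because the final accept/reject uses the exact log density, errors in the surrogate gradient only lower the acceptance rate rather than bias the samples.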
Bayesian Optimization with Robust Bayesian Neural Networks
2016
Neural Information Processing Systems
We obtain scalability through stochastic gradient Hamiltonian Monte Carlo, whose robustness we improve via a scale adaptation. ...
We present a general approach for using flexible parametric models (neural networks) for Bayesian optimization, staying as close to a truly Bayesian treatment as possible. ...
[20] and results in an algorithm called generalized stochastic gradient Riemann Hamiltonian Monte Carlo (gSGRHMC). ...
dblp:conf/nips/SpringenbergKFH16
fatcat:vmmf3aodjrgc7nqxx7qua6jgfa
Data-Enhanced Variational Monte Carlo for Rydberg Atom Arrays
[article]
2022
arXiv
pre-print
Today, novel groundstate wavefunction ansätze like recurrent neural networks (RNNs) can be efficiently trained not only from projective measurement data, but also through Hamiltonian-guided variational ...
Hamiltonian-driven training methods common in variational Monte Carlo (VMC) [32][33][34]. ...
arXiv:2203.04988v1
fatcat:lnu6f5q4mvbmdd7q3zv7cvqin4
Showing results 1 — 15 out of 4,486 results