Neural Spline Flows [article]

Conor Durkan, Artur Bekasov, Iain Murray, George Papamakarios
2019 arXiv   pre-print
We demonstrate that neural spline flows improve density estimation, variational inference, and generative modeling of images.  ...  A normalizing flow models a complex probability density as an invertible transformation of a simple base density.  ...  Neural ordinary differential equations [Neural ODEs, 3] define an additional ODE which describes the trajectory of the flow's gradient, avoiding the need to backpropagate through an ODE solver.  ... 
arXiv:1906.04032v2 fatcat:dw56ywfo2nao7frnmlfgo366yq
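
The snippet above states the defining property of a normalizing flow: a complex density is obtained by pushing a simple base density through an invertible map. For reference, the standard change-of-variables identity (textbook notation, not quoted from the paper), with base density p_Z and bijection f taking z to x, reads

    p_X(x) = p_Z\big(f^{-1}(x)\big)\,\left|\det \frac{\partial f^{-1}(x)}{\partial x}\right|.

The paper's contribution is to build f from monotonic rational-quadratic splines whose knot parameters are produced by a neural network, so that both the inverse and the Jacobian determinant stay cheap to evaluate.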

Block Neural Autoregressive Flow [article]

Nicola De Cao, Ivan Titov, Wilker Aziz
2019 arXiv   pre-print
Recently, as an alternative to hand-crafted bijections, Huang et al. (2018) proposed neural autoregressive flow (NAF) which is a universal approximator for density functions.  ...  Their flow is a neural network (NN) whose parameters are predicted by another NN.  ...  Acknowledgements We would like to thank George Papamakarios and Luca Falorsi for insightful discussions.  ... 
arXiv:1904.04676v1 fatcat:ex3gsvgi6rfpbi2w4skzqboczy
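
For context on the snippet's description of NAF as one network whose parameters are predicted by another: in an autoregressive flow, each output dimension is transformed conditioned on the preceding inputs. A generic form (standard notation, not taken verbatim from the paper) is

    y_i = f(x_i;\,\phi_i), \qquad \phi_i = c_\theta(x_{<i}),

where f is strictly monotonic in x_i so the overall map is invertible; in NAF, f is itself a small neural network with positivity constraints on its weights and c_θ is the conditioner network that predicts its parameters, whereas B-NAF models the bijection directly with a single feed-forward network whose weight matrices carry a block-autoregressive structure.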

Neural Canonical Transformation with Symplectic Flows [article]

Shuo-Hui Li, Chen-Xiao Dong, Linfeng Zhang, Lei Wang
2019 arXiv   pre-print
We present an efficient implementation of symplectic neural coordinate transformations and two ways to train the model.  ...  Correspondingly, the phase space density of the physical system flows towards a factorized Gaussian distribution in the latent space.  ...  One of the key problems for the flow models is to design flexible bijective neural networks whose Jacobian determinants are efficiently computable.  ... 
arXiv:1910.00024v2 fatcat:i2ofcghlq5h4bonaxlpcsfat2q
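
The snippet's remark about designing bijective networks with efficiently computable Jacobian determinants is usually addressed with triangular-Jacobian layers. As a standard illustration (an affine coupling layer in the RealNVP style, not this paper's symplectic construction):

    y_{1:d} = x_{1:d}, \qquad y_{d+1:D} = x_{d+1:D} \odot \exp\big(s(x_{1:d})\big) + t(x_{1:d}), \qquad \log\left|\det J\right| = \sum_{j} s_j(x_{1:d}),

so the determinant costs only a sum over the outputs of the scale network s, because the Jacobian is triangular.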

Neural Canonical Transformation with Symplectic Flows

Shuo-Hui Li, Chen-Xiao Dong, Linfeng Zhang, Lei Wang
2020 Physical Review X  
We present an efficient implementation of symplectic neural coordinate transformations and two ways to train the model based either on the Hamiltonian function or phase-space samples.  ...  Intriguingly, it has a natural correspondence to normalizing flows with a symplectic constraint.  ...  Since symplectic symmetry is crucial for the canonical transformation, it is essential to employ symplectic integrators [3] in the neural ODE implementation.  ... 
doi:10.1103/physrevx.10.021020 fatcat:be4javjczfg3pjj33f42c3l7u4
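
The last fragment notes that the neural ODE has to be integrated with a symplectic scheme. Below is a minimal sketch of such a scheme for a separable Hamiltonian H(q, p) = p^2/2 + V(q); the function names and the toy harmonic-oscillator potential are illustrative, not taken from the paper.

    import numpy as np

    def leapfrog(q, p, grad_V, dt, steps):
        """Stormer-Verlet (leapfrog) integrator: a symplectic, volume-preserving
        update for a separable Hamiltonian H(q, p) = p^2/2 + V(q), unit mass."""
        for _ in range(steps):
            p = p - 0.5 * dt * grad_V(q)   # half kick
            q = q + dt * p                 # drift
            p = p - 0.5 * dt * grad_V(q)   # half kick
        return q, p

    # Toy usage: harmonic oscillator, V(q) = q^2/2 so grad_V(q) = q.
    q, p = leapfrog(np.array([1.0]), np.array([0.0]), lambda q: q, dt=0.01, steps=1000)
    print(q, p)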

Beltrami Flow and Neural Diffusion on Graphs [article]

Benjamin Paul Chamberlain, James Rowbottom, Davide Eynard, Francesco Di Giovanni, Xiaowen Dong, Michael M Bronstein
2021 arXiv   pre-print
We propose a novel class of graph neural networks based on the discretised Beltrami flow, a non-Euclidean diffusion PDE.  ...  The resulting model generalises many popular graph neural networks and achieves state-of-the-art results on several benchmarks.  ...  In addition, there are several other works that apply the neural ODE framework to graphs.  ... 
arXiv:2110.09443v1 fatcat:jwytc2ywgnhijo5ijdiwerhawa
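
To make the "graph neural network as a discretised diffusion PDE" viewpoint in the snippet concrete, here is a deliberately simplified sketch: isotropic feature diffusion dX/dt = (A_hat - I) X with a symmetrically normalised adjacency A_hat, integrated by explicit Euler steps. This is the plain-diffusion special case, not the paper's Beltrami flow with learned attention and joint positional/feature coordinates.

    import numpy as np

    def normalized_adjacency(A):
        """Symmetric normalisation D^{-1/2} A D^{-1/2} of an adjacency matrix."""
        d = A.sum(axis=1)
        d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
        return A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    def diffuse(A, X, dt=0.1, steps=20):
        """Explicit Euler discretisation of dX/dt = (A_hat - I) X."""
        A_hat = normalized_adjacency(A)
        for _ in range(steps):
            X = X + dt * (A_hat @ X - X)
        return X

    # Toy usage: a 3-node path graph with 2-dimensional node features.
    A = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
    X = np.array([[1., 0.], [0., 0.], [0., 1.]])
    print(diffuse(A, X))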

Information Flow in Deep Neural Networks [article]

Ravid Shwartz-Ziv
2022 arXiv   pre-print
We later discuss one of the most challenging problems of applying the IB to deep neural networks - estimating mutual information.  ...  Recent theoretical developments, such as the neural tangent kernel (NTK) framework, are used to investigate generalization signals.  ...  Acknowledgments First and foremost, I am extremely grateful to my supervisor and mentor, Professor Naftali Tishby, who passed away recently.  ... 
arXiv:2202.06749v2 fatcat:eo3pcousavg3zp5xza57kejjq4
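
The snippet refers to the Information Bottleneck view of a representation T of an input X with label Y, and to the difficulty of estimating mutual information. For reference, the standard IB Lagrangian and the definition it relies on (textbook form, not a result of this work) are

    \min_{p(t \mid x)} \; I(X;T) - \beta\, I(T;Y), \qquad I(U;V) = \mathbb{E}_{p(u,v)}\left[\log \frac{p(u,v)}{p(u)\,p(v)}\right],

and estimating I(X;T) for high-dimensional, nearly deterministic networks is precisely the challenge the snippet mentions.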

Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows [article]

George Papamakarios, David C. Sterratt, Iain Murray
2019 arXiv   pre-print
SNL trains an autoregressive flow on simulated data in order to learn a model of the likelihood in the region of high posterior density.  ...  We present Sequential Neural Likelihood (SNL), a new method for Bayesian inference in simulator models, where the likelihood is intractable but simulating data from the model is possible.  ...  The comparison with them is meant to demonstrate the gap between state-of-the-art neural-based methods and off-the-shelf, commonly-used alternatives.  ... 
arXiv:1805.07226v2 fatcat:n5vhctbtrzeg3amutvprom3r2i
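
The snippet summarises how SNL turns a learned likelihood into a posterior: after round r, the autoregressive flow q_phi(x | theta), trained on simulated pairs (theta, x), is combined with the prior, and the next round's simulations are drawn from the resulting approximate posterior (notation here follows standard descriptions of SNL rather than the paper verbatim):

    \hat{p}_r(\theta \mid x_o) \;\propto\; q_{\phi_r}(x_o \mid \theta)\, p(\theta).

MCMC over this proxy supplies the parameters for the next batch of simulations, which concentrates the training data in the region of high posterior density.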

Energetic Variational Neural Network Discretizations to Gradient Flows [article]

Ziqing Hu, Chun Liu, Yiwei Wang, Zhiliang Xu
2022 arXiv   pre-print
The neural-network-based spatial discretization enables us to solve these gradient flows in high dimensions.  ...  We propose structure-preserving neural-network-based numerical schemes to solve both L^2-gradient flows and generalized diffusions.  ...  One way to construct a neural network ℝ^d → ℝ^d to approximate a flow map is to use the planar flow. A K-layer planar flow is defined by T = T_K ∘ ⋯  ... 
arXiv:2206.07303v1 fatcat:anqcpz2mnfdh7cc74yhjleojka
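
The truncated definition in the snippet refers to the standard planar flow: a K-layer flow composes rank-one perturbations of the identity (the layer form below is the usual Rezende-Mohamed planar flow; the paper may constrain it further):

    T = T_K \circ \cdots \circ T_1, \qquad T_k(z) = z + u_k\, h\big(w_k^{\top} z + b_k\big),

with a smooth nonlinearity h (e.g. tanh) and parameters u_k, w_k \in \mathbb{R}^d, b_k \in \mathbb{R}.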

An artificial neural network framework for reduced order modeling of transient flows [article]

Omer San, Romit Maulik, Mansoor Ahmed
2019 arXiv   pre-print
Our approach utilizes a training process from full-order scale direct numerical simulation data projected on proper orthogonal decomposition (POD) modes to achieve an artificial neural network (ANN) model  ...  Our results show that the proposed framework provides a non-intrusive alternative to the evolution of transient physics in a POD basis spanned space, and can be used as a robust predictive model order  ...  General machine learning based methods (of which ANN is a subset) are also growing in popularity for fluid flow based applications and represent a computationally viable alternative to the full Navier-Stokes  ... 
arXiv:1802.09474v2 fatcat:x6hvga4mwnfmlf3c4qtuucppvi
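
The snippet describes projecting full-order simulation data onto POD modes and then learning the evolution of the POD coefficients with an ANN. A minimal numpy sketch of the projection step is given below; the snapshot data are synthetic stand-ins for DNS output, and the ANN that advances the coefficients in time is only indicated in a comment.

    import numpy as np

    # Synthetic snapshot matrix: n_dof spatial degrees of freedom, n_t time steps
    # (assumption: real snapshots would come from direct numerical simulation).
    rng = np.random.default_rng(0)
    n_dof, n_t, r = 1000, 200, 8
    snapshots = rng.standard_normal((n_dof, n_t))

    # POD via thin SVD of the mean-subtracted snapshots.
    mean = snapshots.mean(axis=1, keepdims=True)
    U, S, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)
    modes = U[:, :r]                       # first r POD modes
    coeffs = modes.T @ (snapshots - mean)  # (r, n_t) POD coefficient time series

    # An ANN would then be trained to map coeffs[:, k] -> coeffs[:, k+1],
    # giving a non-intrusive reduced-order model of the transient flow.
    print(modes.shape, coeffs.shape)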

A Level Set Theory for Neural Implicit Evolution under Explicit Flows [article]

Ishit Mehta, Manmohan Chandraker, Ravi Ramamoorthi
2022 arXiv   pre-print
Coordinate-based neural networks parameterizing implicit surfaces have emerged as efficient representations of geometry.  ...  Our method uses the flow field to deform parametric implicit surfaces by extending the classical theory of level sets.  ...  to a neural implicit, 2) A mesh-based algorithm is used to derive a flow field on the explicit surface (§ 4.1), and 3) A corresponding Eulerian flow field is used to evolve the implicit geometry (§  ... 
arXiv:2204.07159v2 fatcat:bubfu3vwpnf55no34cookkxt3m
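
The "Eulerian flow field used to evolve the implicit geometry" in the snippet is the classical level-set evolution: if the surface is the zero set of φ and moves under a velocity field V, then (standard level-set equation, not notation taken from the paper)

    \frac{\partial \varphi}{\partial t} + V \cdot \nabla \varphi = 0,

which for motion with speed v along the normal reduces to \partial\varphi/\partial t + v\,|\nabla\varphi| = 0; the paper extends this machinery to the case where φ is parameterised by a coordinate-based neural network.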

Predictions of turbulent shear flows using deep neural networks [article]

P. A. Srinivasan, L. Guastoni, H. Azizpour, P. Schlatter, R. Vinuesa
2019 arXiv   pre-print
In the present work we assess the capabilities of neural networks to predict temporally evolving turbulent flows. In particular, we use the nine-equation shear flow model by Moehlis et al. [New J. Phys. 6, 56 (2004)] to generate training data for two types of neural networks: the multilayer perceptron (MLP) and the long short-term memory (LSTM) network.  ...  Consequently, we consider that the computational cost of evaluating the neural network is sufficiently low to constitute an efficient alternative for predicting instantaneous variables as e.g. in SGS models  ... 
arXiv:1905.03634v1 fatcat:5pyrc2o6xvdelg26egbroyzc3a
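
The snippet describes training MLP and LSTM networks on time series generated by the nine-equation model of Moehlis et al. A toy sketch of the same one-step-ahead prediction setup is shown below, with a synthetic signal standing in for the mode amplitudes and scikit-learn's MLPRegressor standing in for the networks used in the paper.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Stand-in for one mode amplitude: a noisy sine wave
    # (assumption: real data would come from integrating the nine-equation model).
    t = np.linspace(0, 100, 5000)
    a = np.sin(0.7 * t) + 0.05 * np.random.default_rng(0).standard_normal(t.size)

    # One-step-ahead pairs: a window of p past values predicts the next value.
    p = 10  # lag window length (an illustrative choice)
    X = np.stack([a[i:i + p] for i in range(a.size - p)])
    y = a[p:]

    model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
    model.fit(X[:4000], y[:4000])
    print("test MSE:", np.mean((model.predict(X[4000:]) - y[4000:]) ** 2))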

Dissecting Neural ODEs [article]

Stefano Massaroli, Michael Poli, Jinkyoo Park, Atsushi Yamashita, Hajime Asama
2021 arXiv   pre-print
Continuous deep learning architectures have recently re-emerged as Neural Ordinary Differential Equations (Neural ODEs).  ...  Higher-order Neural ODEs: further parameter efficiency can be achieved by lifting the Neural ODEs to higher orders.  ...  In this section we discuss alternative augmentation strategies for Neural ODEs that match or improve on 0-augmentation in terms of performance or parameter efficiency.  ... 
arXiv:2002.08071v4 fatcat:3y7ppq3ygzalll5vwuebgnydpa
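
For reference alongside the snippet's discussion of augmentation: a Neural ODE treats the hidden state as the solution of a learned ODE, and 0-augmentation integrates an augmented state obtained by appending zeros to the input (standard formulation; the paper's alternative strategies change how these extra dimensions are initialised and parameterised):

    \dot{z}(t) = f_\theta\big(z(t), t\big), \qquad z(0) = \big[\,x^{\top},\ 0_a^{\top}\,\big]^{\top} \in \mathbb{R}^{d+a}, \qquad \hat{y} = \ell\big(z(T)\big).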

Artificial Neural Networks for Forecasting Passenger Flows on Metro Lines

Mariano Gallo, Giuseppina De Luca, Luca D'Acierno, Marilisa Botte
2019 Sensors  
Numerical results show that the proposed approach is able to forecast the flows on metro sections with satisfactory precision.  ...  In this paper, we propose the use of Artificial Neural Networks (ANNs) for forecasting metro onboard passenger flows as a function of passenger counts at station turnstiles.  ...  Acknowledgments: The authors are grateful to the anonymous reviewers for their valuable comments and suggestions. Conflicts of Interest: The authors declare no conflict of interest.  ... 
doi:10.3390/s19153424 fatcat:lu6hpdvkkjcp5nqv4ohyn2btqq

CAINNFlow: Convolutional block Attention modules and Invertible Neural Networks Flow for anomaly detection and localization tasks [article]

Ruiqing Yan, Fan Zhang, Mengyuan Huang, Wu Liu, Dongyu Hu, Jinfeng Li, Qiang Liu, Jingrong Jiang, Qianjin Guo, Linghan Zheng
2022 arXiv   pre-print
to retain and effectively extract spatial structure information in the normalizing flow model.  ...  In order to retain and effectively extract spatial structure information, we design in this study a complex function model with alternating CBAM embedded in a stacked 3×3 full convolution, which is able  ...  industrial production efficiency.  ... 
arXiv:2206.01992v4 fatcat:s7xli3ksvjhnzc3bj3ce7x3u5u
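
To make the snippet's "CBAM embedded in a stacked 3×3 full convolution" concrete, below is a minimal CBAM-style attention block in PyTorch. The hyperparameters (reduction=16, a 7×7 spatial kernel) follow the original CBAM paper; how CAINNFlow alternates such blocks inside the coupling subnetworks of its invertible flow is not reproduced here.

    import torch
    import torch.nn as nn

    class CBAM(nn.Module):
        """Minimal CBAM-style block: channel attention followed by spatial attention."""
        def __init__(self, channels, reduction=16, kernel_size=7):
            super().__init__()
            self.mlp = nn.Sequential(
                nn.Linear(channels, channels // reduction),
                nn.ReLU(inplace=True),
                nn.Linear(channels // reduction, channels),
            )
            self.spatial = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

        def forward(self, x):
            b, c, _, _ = x.shape
            # Channel attention: shared MLP on average- and max-pooled descriptors.
            avg = self.mlp(x.mean(dim=(2, 3)))
            mx = self.mlp(x.amax(dim=(2, 3)))
            x = x * torch.sigmoid(avg + mx).view(b, c, 1, 1)
            # Spatial attention: conv on channel-wise mean and max maps.
            s = torch.cat([x.mean(dim=1, keepdim=True),
                           x.amax(dim=1, keepdim=True)], dim=1)
            return x * torch.sigmoid(self.spatial(s))

    # Usage: attn = CBAM(64); y = attn(torch.randn(2, 64, 32, 32))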

On the use of recurrent neural networks for predictions of turbulent flows [article]

Luca Guastoni, Prem A. Srinivasan, Hossein Azizpour, Philipp Schlatter, Ricardo Vinuesa
2020 arXiv   pre-print
Furthermore, more sophisticated loss functions, including not only the instantaneous predictions but also the averaged behavior of the flow, may lead to much faster neural network training.  ...  We also observe that using a loss function based only on the instantaneous predictions of the flow may not lead to the best predictions in terms of turbulence statistics, and it is necessary to define  ...  Next we explore different strategies to potentially improve the accuracy and efficiency of RNN predictions.  ... 
arXiv:2002.01222v1 fatcat:euatpxu64rd5lht7365rxb7n2a
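
One way to read the snippet's call for loss functions that include "not only the instantaneous predictions but also the averaged behavior of the flow" is a weighted sum of an instantaneous error and a statistics error (an illustrative form, not a formula quoted from the paper):

    \mathcal{L} = \frac{1}{N}\sum_{n=1}^{N} \big\| \hat{a}(t_n) - a(t_n) \big\|^2 \;+\; \lambda\, \big\| \overline{\hat{a}} - \overline{a} \big\|^2,

where overbars denote time averages (or other turbulence statistics) and λ balances the two contributions.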
Showing results 1 — 15 out of 10,164 results