19,367 Hits in 2.1 sec

Lagrangian Neural Networks [article]

Miles Cranmer, Sam Greydanus, Stephan Hoyer, Peter Battaglia, David Spergel, Shirley Ho
2020 arXiv   pre-print
In this paper, we propose Lagrangian Neural Networks (LNNs), which can parameterize arbitrary Lagrangians using neural networks.  ...  Yet even though neural network models see increasing use in the physical sciences, they struggle to learn these symmetries.  ...  Acknowledgments Miles Cranmer would like to thank Jeremy Goodman for several discussions and validation of the theory behind the Lagrangian approach, and Adam Burrows and Oliver Philcox for very helpful  ... 
arXiv:2003.04630v2 fatcat:eude4cua7ngx7bq376fcp7njfq
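The mechanism behind this family of models is the Euler-Lagrange equation: given any scalar Lagrangian L(q, qdot), the acceleration follows from its derivatives. A minimal sketch of that step, with a known analytic Lagrangian and central finite differences standing in for the paper's neural network and automatic differentiation:

```python
def accel_from_lagrangian(L, q, qdot, h=1e-5):
    """Solve the 1-D Euler-Lagrange equation for the acceleration:
    qddot = (dL/dq - qdot * d2L/dq dqdot) / (d2L/dqdot^2),
    using central finite differences in place of autodiff."""
    dL_dq = (L(q + h, qdot) - L(q - h, qdot)) / (2 * h)
    d2L_dqdot2 = (L(q, qdot + h) - 2 * L(q, qdot) + L(q, qdot - h)) / h ** 2
    d2L_dq_dqdot = (L(q + h, qdot + h) - L(q + h, qdot - h)
                    - L(q - h, qdot + h) + L(q - h, qdot - h)) / (4 * h ** 2)
    return (dL_dq - qdot * d2L_dq_dqdot) / d2L_dqdot2

# Harmonic oscillator: L = m*qdot^2/2 - k*q^2/2, so qddot = -(k/m)*q.
m, k = 2.0, 8.0
L = lambda q, qdot: 0.5 * m * qdot ** 2 - 0.5 * k * q ** 2
print(accel_from_lagrangian(L, q=0.5, qdot=1.0))  # close to -(8/2)*0.5 = -2.0
```

In an LNN, `L` would be a neural network and the derivatives would come from autodiff, but the Euler-Lagrange algebra is identical.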

Lagrangian Decomposition for Neural Network Verification [article]

Rudy Bunel, Alessandro De Palma, Alban Desmaison, Krishnamurthy Dvijotham, Pushmeet Kohli, Philip H.S. Torr, M. Pawan Kumar
2020 arXiv   pre-print
A fundamental component of neural network verification is the computation of bounds on the values their outputs can take.  ...  to forward/backward pass of neural network layers and are therefore easily parallelizable, amenable to GPU implementation and able to take advantage of the convolutional structure of problems; and (iii  ...  problems for neural network bounds through Lagrangian Decomposition, which in general yields duals at least as strong as those obtained through Lagrangian relaxation (Guignard and Kim, 1987).  ...
arXiv:2002.10410v3 fatcat:3h56za34nvcr7iz656r46epwti
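The output bounds in question can be illustrated with a far cruder technique than the paper's Lagrangian decomposition: plain interval arithmetic propagated layer by layer. A hedged sketch on a toy network with made-up weights (the paper's duals are provably at least as tight as Lagrangian relaxation, and much tighter than this):

```python
import numpy as np

def interval_bounds(layers, lb, ub):
    """Propagate an input box through affine+ReLU layers with interval
    arithmetic; sound but loose output bounds."""
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    for i, (W, b) in enumerate(layers):
        c, r = (ub + lb) / 2, (ub - lb) / 2   # box center / radius
        c, r = W @ c + b, np.abs(W) @ r       # affine image of the box
        lb, ub = c - r, c + r
        if i < len(layers) - 1:               # ReLU on hidden layers
            lb, ub = np.maximum(lb, 0), np.maximum(ub, 0)
    return lb, ub

# Toy 2-layer ReLU network, inputs in the box [-1, 1]^2.
layers = [(np.array([[1., -1.], [1., 1.]]), np.zeros(2)),
          (np.array([[1., 1.]]), np.zeros(1))]
lb, ub = interval_bounds(layers, [-1, -1], [1, 1])
print(lb, ub)  # every output of this network provably lies in [0, 4]
```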

The Neural Particle Method – An Updated Lagrangian Physics Informed Neural Network for Computational Fluid Dynamics [article]

Henning Wessels and Christian Weißenfels and Peter Wriggers
2020 arXiv   pre-print
Neural networks are smooth, differentiable functions that can be used as a global ansatz for Partial Differential Equations (PDEs).  ...  It has recently been shown that instead of solving a system of equations as in standard numerical methods, a neural network can be trained solely based on initial and boundary conditions.  ...  When a neural network is used as a global spatial ansatz, Raissi et al. [2019] suggested putting a neural network prior on the Runge-Kutta stages and the solution.  ...
arXiv:2003.10208v3 fatcat:c5bo2piurfaeffidsudmrekyea

A Lagrangian Approach to Information Propagation in Graph Neural Networks [article]

Matteo Tiezzi, Giuseppe Marra, Stefano Melacci, Marco Maggini, Marco Gori
2020 arXiv   pre-print
In particular, inspired by the Graph Neural Network (GNN) model, different architectures have been proposed to extend the original GNN scheme.  ...  In fact, the computational structure is based on the search for saddle points of the Lagrangian in the adjoint space composed of weights, neural outputs (node states), and Lagrange multipliers.  ...  A constraint-based formulation of Graph Neural Networks Neural network learning can be cast as a Lagrangian optimization problem by a formulation that requires the minimization of the classical data fitting  ... 
arXiv:2002.07684v3 fatcat:onuvpz6435hvvlpvd43va3ws5e

Lagrangian relaxation neural network for unit commitment

P.B. Luh, Yajun Wang, Xing Zhao
1999 IEEE Power Engineering Society. 1999 Winter Meeting (Cat. No.99CH36233)  
This paper presents a novel method for unit commitment by synergistically combining Lagrangian relaxation for constraint handling with Hopfield-type recurrent neural networks for fast convergence to the  ...  The overall network is proved to be stable, and the difficulties in handling integer variables, subproblem constraints, and subproblem local minima plaguing current neural network methods are avoided.  ...
doi:10.1109/pesw.1999.747504 fatcat:dxsfm2vz6zdrxoboga6i6jeu6a
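Lagrangian relaxation itself, the constraint-handling half of this method, is easy to demonstrate on a toy binary problem: move the hard constraint into the objective with a multiplier and maximize the resulting dual bound by subgradient ascent. A sketch under that simplification (no Hopfield network, just the relaxation):

```python
def dual_ascent(c, a, b, iters=20):
    """Lagrangian relaxation of  min c.x  s.t.  a.x >= b,  x in {0,1}^n:
    relax the constraint with multiplier lam >= 0 and maximize the dual
    g(lam) = lam*b + sum_i min(0, c_i - lam*a_i) by subgradient ascent."""
    lam, best = 0.0, float("-inf")
    for k in range(iters):
        x = [1 if ci - lam * ai < 0 else 0 for ci, ai in zip(c, a)]
        g = lam * b + sum(min(0.0, ci - lam * ai) for ci, ai in zip(c, a))
        best = max(best, g)                        # best lower bound so far
        subgrad = b - sum(ai * xi for ai, xi in zip(a, x))
        lam = max(0.0, lam + subgrad / (k + 1))    # diminishing step size
    return best

# c=[2,3,4], a=[1,2,3], b=3: the primal optimum is 4 (pick item 3 alone).
print(dual_ascent([2, 3, 4], [1, 2, 3], 3))  # the dual bound reaches 4.0
```

By weak duality every `g(lam)` is a valid lower bound on the integer optimum, which is what makes the relaxation useful inside a larger commitment or scheduling loop.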

Deep Lagrangian Constraint-based Propagation in Graph Neural Networks [article]

Matteo Tiezzi, Giuseppe Marra, Stefano Melacci, Marco Maggini
2020 arXiv   pre-print
The popularity of deep learning techniques renewed the interest in neural architectures able to process these patterns, inspired by the Graph Neural Network (GNN) model.  ...  We propose a novel approach to learning in GNNs, based on constrained optimization in the Lagrangian framework.  ...  base neural network module (e.g.  ... 
arXiv:2005.02392v3 fatcat:likvx765tndlphqcwbc5zu4j6q

Lagrangian Density Space-Time Deep Neural Network Topology [article]

Bhupesh Bishnoi
2022 arXiv   pre-print
As a network-based functional approximator, we have proposed a "Lagrangian Density Space-Time Deep Neural Networks" (LDDNN) topology.  ...  This article will discuss the statistical physics interpretation of neural networks in the Lagrangian and Hamiltonian domains.  ...  Figure 1: Lagrangian density neural network. Figure 2: Lagrangian density neural network flowchart.  ...
arXiv:2207.12209v1 fatcat:cdayrwtwnfh4dj4apefifyuoia

Extending Lagrangian and Hamiltonian Neural Networks with Differentiable Contact Models [article]

Yaofeng Desmond Zhong, Biswadip Dey, Amit Chakraborty
2021 arXiv   pre-print
The proposed contact model extends the scope of Lagrangian and Hamiltonian neural networks by allowing simultaneous learning of contact and system properties.  ...  A growing body of work has been exploring ways to enforce energy conservation in the learned dynamics by encoding Lagrangian or Hamiltonian dynamics into the neural network architecture.  ...  Related Work Lagrangian/Hamiltonian-inspired Neural Networks: In the last few years, an increasing volume of work has proposed neural network models to learn the underlying dynamics from data while enforcing  ... 
arXiv:2102.06794v3 fatcat:cazeaqziwjeuhbiq77yl2yysvq

Error-Correcting Neural Networks for Semi-Lagrangian Advection in the Level-Set Method [article]

Luis Ángel Larios-Cárdenas, Frédéric Gibou
2022 arXiv   pre-print
The role of this neural network is to improve the numerically estimated surface trajectory.  ...  The proposed system's starting point is the semi-Lagrangian formulation; to reduce numerical dissipation, we introduce an error-quantifying multilayer perceptron.  ...  This idea would be the semi-Lagrangian-scheme equivalent of the curvature neural-network dictionaries introduced in previous studies [45, 46].  ...
arXiv:2110.11611v2 fatcat:poqhnevxnbcl7c7fefspmwsjky

Lagrangian Neural Network with Differentiable Symmetries and Relational Inductive Bias [article]

Ravinder Bhattoo, Sayan Ranu, N. M. Anoop Krishnan
2021 arXiv   pre-print
Recent works on Lagrangian and Hamiltonian neural networks show that the underlying symmetries of a system can be easily learned by a neural network when provided with an appropriate inductive bias.  ...  Here, we present a momentum conserving Lagrangian neural network (MCLNN) that learns the Lagrangian of a system, while also preserving the translational and rotational symmetries.  ...  Momentum Conserving Lagrangian Neural Network As mentioned earlier, Lagrangians exhibit translational and rotational symmetry.  ... 
arXiv:2110.03266v2 fatcat:w3iuvdjworhxfhqton6wc4lx7m

Neural Network Training as an Optimal Control Problem: An Augmented Lagrangian Approach [article]

Brecht Evens, Puya Latafat, Andreas Themelis, Johan Suykens, Panagiotis Patrinos
2021 arXiv   pre-print
By applying this framework to the training of neural networks, it is shown that the inner Lagrangian subproblems are amenable to being solved using Gauss-Newton iterations.  ...  Training of neural networks amounts to nonconvex optimization problems that are typically solved by using backpropagation and (variants of) stochastic gradient descent.  ...
arXiv:2103.14343v3 fatcat:w7h3jsi5endfpmpcbpargzqkqm
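The augmented Lagrangian loop this abstract describes alternates inner minimizations with multiplier updates. A toy sketch on a two-variable equality-constrained problem, with plain gradient descent standing in for the paper's Gauss-Newton inner iterations:

```python
def augmented_lagrangian():
    """min x^2 + y^2  s.t.  x + y = 1, via the augmented Lagrangian
    L_A = f(x,y) + lam*h(x,y) + (rho/2)*h(x,y)^2 with h = x + y - 1.
    The true solution is x = y = 0.5 with multiplier lam = -1."""
    x = y = 0.0
    lam, rho, lr = 0.0, 10.0, 0.01
    for _ in range(30):                      # outer multiplier updates
        for _ in range(300):                 # inner minimization of L_A
            h = x + y - 1.0
            gx = 2 * x + lam + rho * h       # dL_A/dx
            gy = 2 * y + lam + rho * h       # dL_A/dy
            x, y = x - lr * gx, y - lr * gy
        lam += rho * (x + y - 1.0)           # dual (multiplier) update
    return x, y, lam

x, y, lam = augmented_lagrangian()
print(round(x, 3), round(y, 3), round(lam, 3))  # near 0.5 0.5 -1.0
```

Each outer step tightens constraint satisfaction without sending the penalty weight rho to infinity, which is the advantage over a pure penalty method.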

LagNetViP: A Lagrangian Neural Network for Video Prediction [article]

Christine Allen-Blanchette, Sushant Veer, Anirudha Majumdar, Naomi Ehrich Leonard
2020 arXiv   pre-print
To achieve this, we simultaneously learn a low-dimensional state representation and system Lagrangian.  ...  The kinetic and potential energy terms of the Lagrangian are distinctly modelled and the low-dimensional equations of motion are explicitly constructed using the Euler-Lagrange equations.  ...  Figure 1: Lagrangian neural network for video prediction (LagNetViP) architecture.  ...
arXiv:2010.12932v1 fatcat:yksdiikl25c25joxttgjwioloq

Lagrangian relaxation neural networks for job shop scheduling

P.B. Luh, Xing Zhao, Yajun Wang, L.S. Thakur
2000 IEEE Transactions on Robotics and Automation  
neural network optimization ideas with Lagrangian relaxation (LR) for constraint handling.  ...  In order to effectively solve such combinatorial optimization problems, this paper presents a novel Lagrangian relaxation neural network (LRNN) for separable optimization problems by combining recurrent  ...  The recent developments on neural networks for constrained optimization include combining Hopfield-type networks optimization ideas with Lagrangian relaxation (LR) or augmented LR for constraint handling  ... 
doi:10.1109/70.833193 fatcat:o4ljmcwidrbxtmamidyw4q6sdm

A Lagrangian Dual-based Theory-guided Deep Neural Network [article]

Miao Rong, Dongxiao Zhang, Nanzhe Wang
2020 arXiv   pre-print
The theory-guided neural network (TgNN) is a method that improves the effectiveness and efficiency of neural network architectures by incorporating scientific knowledge or physical information  ...  Despite its great success, the theory-guided (deep) neural network possesses certain limits when maintaining a tradeoff between training data and domain knowledge during the training process.  ...  As a successful representative, the theory-guided neural network framework, also called a physics-informed neural network framework or an informed deep learning framework, which incorporates the theory  ...
arXiv:2008.10159v1 fatcat:gfodbi4qcnecjjwea63oiig5dy

A Lagrangian relaxation network for graph matching

A. Rangarajan, E.D. Mjolsness
1996 IEEE Transactions on Neural Networks  
A Lagrangian relaxation network for graph matching is presented.  ...  With the application of a fixpoint preserving algebraic transformation to both the distance measure and self-amplification terms, we obtain a Lagrangian relaxation network.  ...  Neural net approaches (beginning with the celebrated Hopfield-Tank network [50]) have typically expressed these constraints via penalty functions.  ...
doi:10.1109/72.548165 pmid:18263531 fatcat:jomxlw4qyna5rasd45jytcqm3q
Showing results 1–15 out of 19,367 results