A copy of this work (application/pdf) was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; the original URL can also be visited.
Lagrangian Decomposition for Neural Network Verification
[article]
2020
arXiv
pre-print
A fundamental component of neural network verification is the computation of bounds on the values their outputs can take. ...
This results in an overall speed-up when employing the bounds for formal verification. Code for our algorithms is available at https://github.com/oval-group/decomposition-plnn-bounds. ...
Acknowledgments ADP was supported by the EPSRC Centre for Doctoral Training in Autonomous Intelligent Machines and Systems, grant EP/L015987/1. ...
arXiv:2002.10410v3
fatcat:3h56za34nvcr7iz656r46epwti
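The core idea behind the Lagrangian decomposition named in this result can be sketched on a toy problem: duplicate a variable, dualize the coupling constraint, and ascend the resulting concave dual to tighten a lower bound. Everything below (the functions, domains, and step size) is an illustrative assumption, not code from the linked paper.

```python
import numpy as np

# Toy Lagrangian decomposition: bound min_x f(x) + g(x) from below by
# duplicating the variable (x = z) and dualizing the coupling constraint:
#   min_{x,z} f(x) + g(z) + rho * (x - z)
# which splits into two independent subproblems for any dual value rho.
# Here f(x) = (x-1)^2 and g(z) = (z+1)^2, both over [0, 2] (made up).

def f_min(rho):
    # closed-form min over x in [0,2] of (x-1)^2 + rho*x
    x = np.clip(1 - rho / 2, 0.0, 2.0)
    return (x - 1) ** 2 + rho * x

def g_min(rho):
    # closed-form min over z in [0,2] of (z+1)^2 - rho*z
    z = np.clip(rho / 2 - 1, 0.0, 2.0)
    return (z + 1) ** 2 - rho * z

def dual_bound(rho):
    # weak duality: this is a valid lower bound for every rho
    return f_min(rho) + g_min(rho)

# Supergradient ascent on the concave dual (the supergradient is x - z).
rho, step = 0.0, 0.1
for _ in range(200):
    x = np.clip(1 - rho / 2, 0.0, 2.0)
    z = np.clip(rho / 2 - 1, 0.0, 2.0)
    rho += step * (x - z)

# Brute-force primal optimum for comparison (grid contains the optimum x=0).
primal_opt = min((x - 1) ** 2 + (x + 1) ** 2 for x in np.linspace(0, 2, 2001))
print(dual_bound(rho) <= primal_opt + 1e-9)  # lower bound stays valid
```

The same mechanism underlies the paper's layer-wise bounds: the duplicated variables are the activations shared between consecutive layers, and each subproblem involves a single layer.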
DeepSplit: Scalable Verification of Deep Neural Networks via Operator Splitting
[article]
2022
arXiv
pre-print
However, even for reasonably-sized neural networks, these relaxations are not tractable, and so must be replaced by even weaker relaxations in practice. ...
We demonstrate our method in obtaining tighter bounds on the worst-case performance of large convolutional networks in image classification and reinforcement learning settings. ...
Branch and bound for piecewise linear neural network verification. John Duchi, Shai Shalev-Shwartz, Yoram Singer, and Tushar Chandra. ...
arXiv:2106.09117v2
fatcat:wmuclkb6yzdkzf24pz7s7wbrpe
Neural Network Branch-and-Bound for Neural Network Verification
[article]
2021
arXiv
pre-print
Many available formal verification methods have been shown to be instances of a unified Branch-and-Bound (BaB) formulation. ...
Our combined framework achieves a 50% reduction in both the number of branches and the time required for verification on various convolutional networks when compared to several state-of-the-art verification ...
Improved branch and bound for neural network verification via Lagrangian decomposition. arXiv preprint arXiv:2104.06718, 2021. ...
arXiv:2107.12855v1
fatcat:3hktdiw2vbfndbyec4c7yohvsm
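Branch and bound, as it appears throughout these results, alternates a cheap incomplete bounding step with domain splitting: certify a subdomain when the bound suffices, otherwise branch. A minimal input-splitting sketch on a made-up one-dimensional ReLU function (not any linked paper's algorithm):

```python
# Prove min over x in [l, u] of f(x) >= 0 for the toy function
#   f(x) = relu(x) - 0.5*x + 0.2
# using naive interval arithmetic as the incomplete bounding step.

def relu(t):
    return max(t, 0.0)

def lower_bound(l, u):
    # interval lower bound: relu is nondecreasing (take l), -0.5*x takes u
    return relu(l) - 0.5 * u + 0.2

def verified(l, u, depth=0, max_depth=20):
    if lower_bound(l, u) >= 0:
        return True          # subdomain certified by the cheap bound
    if depth == max_depth:
        return False         # budget exhausted: property possibly violated
    m = 0.5 * (l + u)        # branch: split the input interval in half
    return verified(l, m, depth + 1) and verified(m, u, depth + 1)

print(verified(-1.0, 1.0))   # the bound is loose at the root but tightens
```

Practical verifiers branch on ReLU activation phases rather than raw input intervals, and use far tighter bounding routines, but the certify-or-split loop is the same.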
Beta-CROWN: Efficient Bound Propagation with Per-neuron Split Constraints for Complete and Incomplete Neural Network Robustness Verification
[article]
2021
arXiv
pre-print
Bound propagation based incomplete neural network verifiers such as CROWN are very efficient and can significantly accelerate branch-and-bound (BaB) based complete verification of neural networks. ...
By terminating BaB early, our method can also be used for efficient incomplete verification. ...
[11] proposed a better branching strategy, filtered smart branching (FSB), to further improve the verification performance of [6], but the Lagrangian Decomposition based incomplete verifier and the branch ...
arXiv:2103.06624v2
fatcat:ier7kpgjnbgpreabp26ik43eda
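For context on bound propagation, the cheapest member of the family CROWN belongs to is plain interval arithmetic pushed forward through affine and ReLU layers. The sketch below uses made-up weights and is much looser than CROWN itself; it only illustrates the box-in, box-out interface such incomplete verifiers share.

```python
import numpy as np

def affine_bounds(W, b, l, u):
    # tight interval image of an affine layer: positive weights take the
    # lower input for the lower bound, negative weights take the upper
    Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
    return Wp @ l + Wn @ u + b, Wp @ u + Wn @ l + b

def ibp(layers, l, u):
    # interval bound propagation through affine layers with ReLU between
    for i, (W, b) in enumerate(layers):
        l, u = affine_bounds(W, b, l, u)
        if i < len(layers) - 1:            # ReLU on hidden layers only;
            l, u = np.maximum(l, 0.0), np.maximum(u, 0.0)  # it is monotone
    return l, u

# Hypothetical two-layer network and input box [-0.5, 0.5]^2.
layers = [(np.array([[1.0, -1.0], [0.5, 0.5]]), np.zeros(2)),
          (np.array([[1.0, 1.0]]), np.array([-0.1]))]
l, u = ibp(layers, np.array([-0.5, -0.5]), np.array([0.5, 0.5]))
print(l, u)  # output box guaranteed to contain every reachable output
```

CROWN-style methods replace the per-layer boxes with linear lower/upper relaxations propagated backward, and the per-neuron split constraints of Beta-CROWN tighten those relaxations inside branch and bound.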
GoTube: Scalable Stochastic Verification of Continuous-Depth Models
[article]
2021
arXiv
pre-print
Compared to advanced reachability analysis tools for time-continuous neural networks, GoTube does not accumulate overapproximation errors between time steps and avoids the infamous wrapping effect inherent ...
GoTube is implemented in JAX and optimized to scale to complex continuous-depth neural network models. ...
A more scalable approach for rectified linear unit (ReLU) networks (Nair and Hinton 2010) was recently proposed based on Lagrangian decomposition; this approach significantly improves the speed and tightness ...
arXiv:2107.08467v2
fatcat:lird25izyjc6rb6qk3zaqw753y
A Dual Approach to Scalable Verification of Deep Networks
[article]
2018
arXiv
pre-print
outputs (robustness to bounded norm adversarial perturbations, for example). ...
We formulate verification as an optimization problem (seeking to find the largest violation of the specification) and solve a Lagrangian relaxation of the optimization problem to obtain an upper bound ...
ACKNOWLEDGEMENTS The authors would like to thank Brendan O'Donoghue, Csaba Szepesvari, Rudy Bunel, Jonathan Uesato and ...
arXiv:1803.06567v2
fatcat:jxiioie3crfuvchtcki5lcqv7e
Generating Adversarial Examples with Graph Neural Networks
[article]
2021
arXiv
pre-print
Recent years have witnessed the deployment of adversarial attacks to evaluate the robustness of Neural Networks. ...
To alleviate these deficiencies, we propose a novel attack based on a graph neural network (GNN) that takes advantage of the strengths of both approaches; we call it AdvGNN. ...
GNNs have been used in Neural Network Verification to learn the branching strategy in a Branch-and-Bound algorithm [Lu and Kumar, 2020] and to estimate better bounds [Dvijotham et al., 2018 , Gowal ...
arXiv:2105.14644v1
fatcat:fm7pkhpw4nb57kdkuuivbhrsgq
Verification for Machine Learning, Autonomy, and Neural Networks Survey
[article]
2018
arXiv
pre-print
Autonomy in CPS is enabled by recent advances in artificial intelligence (AI) and machine learning (ML) through approaches such as deep neural networks (DNNs), embedded in so-called learning-enabled components ...
Recently, the formal methods and formal verification community has developed methods to characterize behaviors in these LECs with eventual goals of formally verifying specifications for LECs, and this ...
Additionally, in the paper, the authors present a novel approach for neural network verification called Branch and Bound Optimization. ...
arXiv:1810.01989v1
fatcat:a5ax66lsxbho3fuxuh55ypnm6m
Table of Contents
2020
IEEE Signal Processing Letters
Cesar 1660 Optimality Verification of Tensor Completion Model via Self-Validation, C. Liu, H. Shan, T. Ma, and B. ...
Model-Aided Deep Neural Network for Source Number Detection, Y. Yang, F. Gao, C. Qian, and G. ...
Pal 1395 Joint DOD and DOA Estimation in Slow-Time MIMO Radar via PARAFAC Decomposition ...
doi:10.1109/lsp.2020.3040844
fatcat:xpovskhrvfgctk3hhufuvpyyne
Planning and Operations Research (Dagstuhl Seminar 18071)
2018
Dagstuhl Reports
of artificial intelligence where the emphasis was traditionally more on symbolic and logical search techniques for the intelligent selection and sequencing of actions to achieve a set of goals. ...
This report documents the program and the outcomes of Dagstuhl Seminar 18071 "Planning and Operations Research". ...
It seems that Lagrangian decomposition could improve the heuristic estimates. ...
doi:10.4230/dagrep.8.2.26
dblp:journals/dagstuhl-reports/BeckMRH18
fatcat:lavt5jfujfarfmtwrpbbxan2oq
Table of Contents
2020
IEEE Signal Processing Letters
De Maio, and G. Cui 885 An Exploratory Method for Smooth/Transient Decomposition, C. ...
Li 1315 Progressive Motion Representation Distillation With Two-Branch Networks for Egocentric Activity Recognition ...
Du 1844 Optimality Verification of Tensor Completion Model via Self-Validation, C. Liu, H. Shan, T. Ma, and B. ...
doi:10.1109/lsp.2020.3040840
fatcat:ezrfzwo6tjbkfhohq2tgec4m6y
A Survey of Safety and Trustworthiness of Deep Neural Networks: Verification, Testing, Adversarial Attack and Defence, and Interpretability
[article]
2020
arXiv
pre-print
In the past few years, significant progress has been made on deep neural networks (DNNs) in achieving human-level performance on several long-standing tasks. ...
This survey paper conducts a review of the current research effort into making DNNs safe and trustworthy, by focusing on four aspects: verification, testing, adversarial attack and defence, and interpretability ...
of verification approaches for neural networks. ...
arXiv:1812.08342v5
fatcat:awndtbca4jbi3pcz5y2d4ymoja
A survey of safety and trustworthiness of deep neural networks: Verification, testing, adversarial attack and defence, and interpretability
2020
Computer Science Review
of verification approaches for neural networks. ...
Verification In this section, we review verification techniques for neural networks. ...
doi:10.1016/j.cosrev.2020.100270
fatcat:biji56htvnglfhl7n3jnuelu2i
Methods for off-line/on-line optimization under uncertainty
2018
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
The methods are applicable when: 1) the uncertainty is exogenous; 2) there exists a heuristic for the on-line phase that can be modeled as a parametric convex optimization problem. ...
We instantiate our approaches on two case studies: an energy management system with uncertain renewable generation and load demand, and a vehicle routing problem with uncertain travel times. ...
The approach is extended in [Lombardi and Gualandi, 2016 ] to two-layer networks via a Lagrangian relaxation. ...
doi:10.24963/ijcai.2018/177
dblp:conf/ijcai/Filippo0M18
fatcat:jig77qymz5eqbab2pyuabfonyu
Boosting Combinatorial Problem Modeling with Machine Learning
2018
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
In this survey we focus on the modeling component, whose effectiveness is crucial for solving the problem. ...
The three pillars of constraint satisfaction and optimization problem solving, i.e., modeling, search, and optimization, can exploit ML techniques to boost their accuracy, efficiency and effectiveness. ...
The approach is extended in [Lombardi and Gualandi, 2016 ] to two-layer networks via a Lagrangian relaxation. ...
doi:10.24963/ijcai.2018/772
dblp:conf/ijcai/0001M18
fatcat:zy52jyurtfc5zps7nzbcp7xxpu
Showing results 1 — 15 out of 326 results