Optimization and Abstraction: A Synergistic Approach for Analyzing Neural Network Robustness

Greg Anderson, Shankara Pailoor, Isil Dillig, Swarat Chaudhuri
2019 arXiv pre-print
In recent years, the notion of local robustness (or robustness for short) has emerged as a desirable property of deep neural networks. Intuitively, robustness means that small perturbations to an input do not cause the network to misclassify it. In this paper, we present a novel algorithm for verifying robustness properties of neural networks. Our method synergistically combines gradient-based optimization methods for counterexample search with abstraction-based proof search to obtain a sound and (δ-)complete decision procedure. Our method also employs a data-driven approach to learn a verification policy that guides abstract interpretation during proof search. We have implemented the proposed approach in a tool called Charon and experimentally evaluated it on hundreds of benchmarks. Our experiments show that the proposed approach significantly outperforms three state-of-the-art tools, namely AI^2, Reluplex, and Reluval.
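The counterexample-search half of the approach can be illustrated with a minimal sketch of gradient-based adversarial search inside an L-infinity ball (projected gradient ascent). The tiny two-class ReLU network, its weights, and the perturbation radii below are invented for illustration; they are not the paper's method, benchmarks, or Charon's implementation, and a failed search does not prove robustness (that is the job of the abstraction-based proof search).

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def forward(x, W1, b1, W2, b2):
    """Two-layer ReLU network returning class logits."""
    h = relu(W1 @ x + b1)
    return W2 @ h + b2

def logit_margin_grad(x, W1, b1, W2, b2, true_cls, other_cls):
    """Gradient of (logit[other_cls] - logit[true_cls]) w.r.t. the input x."""
    z = W1 @ x + b1
    # d(margin)/dh, back-propagated through the ReLU mask to the input.
    dmargin_dh = W2[other_cls] - W2[true_cls]
    return W1.T @ (dmargin_dh * (z > 0))

def search_counterexample(x0, eps, W1, b1, W2, b2, steps=200, lr=0.05):
    """Projected gradient ascent inside the L-infinity ball of radius eps.

    Returns a perturbed input that flips the prediction, or None if the
    search fails. Failure is inconclusive: soundness must come from the
    proof-search side of the procedure."""
    true_cls = int(np.argmax(forward(x0, W1, b1, W2, b2)))
    other_cls = 1 - true_cls  # two-class network, for simplicity
    x = x0.copy()
    for _ in range(steps):
        if int(np.argmax(forward(x, W1, b1, W2, b2))) != true_cls:
            return x  # misclassification found: a robustness counterexample
        g = logit_margin_grad(x, W1, b1, W2, b2, true_cls, other_cls)
        x = x + lr * np.sign(g)             # ascend the adversarial margin
        x = np.clip(x, x0 - eps, x0 + eps)  # project back into the eps-ball
    return None

# Illustrative weights (identity layers) and input; class 0 initially wins.
W1 = np.eye(2); b1 = np.zeros(2)
W2 = np.eye(2); b2 = np.zeros(2)
x0 = np.array([1.0, 0.5])

ce = search_counterexample(x0, eps=0.5, W1=W1, b1=b1, W2=W2, b2=b2)
```

With eps = 0.5 the search finds a perturbed input whose prediction flips to class 1, while a tighter radius such as eps = 0.1 leaves the search empty-handed, which is precisely the case the paper hands off to abstraction-based proof search.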
arXiv:1904.09959v1