Manifold Regularization for Locally Stable Deep Neural Networks
[article]
2020 · arXiv pre-print
We apply concepts from manifold regularization to develop new regularization techniques for training locally stable deep neural networks. Our regularizers are based on a sparsification of the graph Laplacian, which holds with high probability when the data is sparse in high dimensions, as is common in deep learning. Empirically, our networks exhibit stability in a diverse set of perturbation models, including ℓ_2, ℓ_∞, and Wasserstein-based perturbations; in particular, we achieve 40% against an …
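For context, the regularizers described above build on the classical manifold-regularization penalty, which encourages a network f to vary smoothly over a neighborhood graph of the data: sum_{i,j} w_ij ||f(x_i) - f(x_j)||^2 = 2 tr(F^T L F), where W = (w_ij) is a symmetric affinity matrix and L = D - W is the graph Laplacian. The sketch below shows the dense version of this penalty in NumPy, assuming a precomputed affinity matrix; the function name and toy data are illustrative assumptions, not the authors' code, and the paper's contribution is to replace this dense sum with a sparsified approximation.

import numpy as np

def laplacian_smoothness_penalty(outputs: np.ndarray, affinities: np.ndarray) -> float:
    """Dense graph-Laplacian smoothness penalty: 0.5 * sum_{i,j} w_ij * ||f(x_i) - f(x_j)||^2.

    outputs:    (n, d) array whose rows are network outputs f(x_i)
    affinities: (n, n) symmetric non-negative weight matrix w_ij,
                e.g., from a k-nearest-neighbor graph over the inputs
    """
    # Pairwise squared distances between output rows, shape (n, n)
    sq_dists = np.sum((outputs[:, None, :] - outputs[None, :, :]) ** 2, axis=-1)
    # Equals trace(F^T L F) with L = D - W the unnormalized graph Laplacian
    return 0.5 * float(np.sum(affinities * sq_dists))

# Toy usage: the penalty is small when neighboring points get similar outputs.
f = np.array([[0.0, 1.0], [0.1, 0.9], [5.0, -3.0]])
w = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])  # only points 0 and 1 are neighbors
print(laplacian_smoothness_penalty(f, w))  # 0.5 * (0.02 + 0.02) = 0.02

In training, such a term would be added to the usual classification loss with a tradeoff coefficient, so that the optimizer balances fitting the labels against producing locally stable outputs.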
arXiv:2003.04286v2
fatcat:7v5tuul45vgy7jiljtannwg6um