16,482 Hits in 5.8 sec

Stability Analysis of a Class of Noise Perturbed Neural Networks

Anke Meyer-Bäse
1997 International Journal of Neural Systems  
We establish robustness stability results for a specific type of artificial neural networks for associative memories under parameter perturbations and determine conditions that ensure the existence of asymptotically  ...  stable equilibria of the perturbed neural system that are near the asymptotically stable equilibria of the original unperturbed neural network.  ...  In this sense we established robustness stability results for the perturbed neural network model and determined the conditions that ensure the existence of asymptotically stable equilibria of the perturbed  ...
doi:10.1142/s0129065797000306 fatcat:46vqxdi3avhlfnazr42ob3nwte
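
A sketch of the setting may help orient the reader; the Hopfield-type model below is the standard form assumed in this literature, not reproduced from the paper itself.

```latex
% Assumed Hopfield-type model, for orientation only.
% Nominal network, with an asymptotically stable equilibrium x*:
\[
  \dot{x} = -A x + T\,s(x) + I
\]
% Perturbed network, with parameter perturbations \Delta A, \Delta T, \Delta I:
\[
  \dot{x} = -(A + \Delta A)\,x + (T + \Delta T)\,s(x) + (I + \Delta I)
\]
% If the Jacobian at x* is Hurwitz, then for sufficiently small perturbations
% the perturbed system retains an asymptotically stable equilibrium near x*.
```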

Residual networks classify inputs based on their neural transient dynamics [article]

Fereshteh Lagzi
2021 arXiv   pre-print
We compare the network dynamics for a ResNet and a Multi-Layer Perceptron and show that the internal dynamics and the noise evolution are fundamentally different in these networks, and ResNets are more  ...  We bring analytical and empirical evidence that residual networks classify inputs based on the integration of the transient dynamics of the residuals, and will show how the network responds to input perturbations  ...  for the lack of stability of the neural activities.  ...
arXiv:2101.03009v2 fatcat:geptfsdgybbengf2i5qx76bnk4
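
The contrast the abstract draws admits a minimal numerical sketch; depth, width, weights, and the tanh nonlinearity below are placeholders, not the author's setup.

```python
# Hypothetical sketch: track how a small input perturbation evolves through
# stacked plain (MLP) layers versus residual layers sharing the same weights.
import numpy as np

rng = np.random.default_rng(0)
depth, width = 20, 64
Ws = [rng.normal(0, 1 / np.sqrt(width), (width, width)) for _ in range(depth)]

def forward(x, residual):
    for W in Ws:
        h = np.tanh(W @ x)
        x = x + h if residual else h  # residual: integrate transients instead of replacing state
    return x

x = rng.normal(size=width)
dx = 1e-3 * rng.normal(size=width)  # small input perturbation
for res in (False, True):
    gap = np.linalg.norm(forward(x + dx, res) - forward(x, res))
    print(f"{'resnet' if res else 'mlp':6s} perturbation after {depth} layers: {gap:.2e}")
```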

How Does Noise Help Robustness? Explanation and Exploration under the Neural SDE Framework

Xuanqing Liu, Tesi Xiao, Si Si, Qin Cao, Sanjiv Kumar, Cho-Jui Hsieh
2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
For regularization purposes, our framework includes multiple types of noise patterns, such as dropout, additive, and multiplicative noise, which are common in plain neural networks.  ...  Some commonly used regularization mechanisms in discrete neural networks (e.g., dropout, Gaussian noise) are missing in current Neural ODE networks.  ...  Based on the formulation, we draw a theoretical connection between the robustness of neural networks and the stability of the solution.  ...
doi:10.1109/cvpr42600.2020.00036 dblp:conf/cvpr/LiuXSCKH20 fatcat:2kpebczdbbfp7jvfy3gafotdpi
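
The core construction admits a compact sketch: a neural ODE dx = f(x) dt becomes a neural SDE dx = f(x) dt + g(x) dW, with the diffusion g playing the role of the injected noise. The Euler-Maruyama discretization, toy drift, and noise scalings below are assumptions, not the authors' implementation.

```python
# Hypothetical neural SDE sketch, simulated with Euler-Maruyama.
import numpy as np

rng = np.random.default_rng(0)
d, n_steps = 8, 100
dt = 1.0 / n_steps
W = rng.normal(0, 1 / np.sqrt(d), (d, d))  # toy drift weights (placeholder)

def drift(x):
    return np.tanh(W @ x)

def diffusion(x, kind, sigma=0.1):
    if kind == "additive":
        return sigma * np.ones_like(x)          # state-independent noise
    if kind == "multiplicative":
        return sigma * x                        # noise scales with the state
    if kind == "dropout":
        p = 0.1                                 # rough continuous-time analogue of
        return np.sqrt(p / (1 - p)) * x         # Bernoulli dropout (scaling assumed)
    raise ValueError(kind)

x = rng.normal(size=d)
for _ in range(n_steps):
    dWt = rng.normal(0.0, np.sqrt(dt), size=d)
    x = x + drift(x) * dt + diffusion(x, "multiplicative") * dWt
print(x)
```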

Exponential Stability of Stochastic Delayed Neural Networks with Inverse Hölder Activation Functions and Markovian Jump Parameters

Yingwei Li, Huaiqin Wu
2014 Discrete Dynamics in Nature and Society  
The exponential stability issue for a class of stochastic neural networks (SNNs) with Markovian jump parameters, mixed time delays, and α-inverse Hölder activation functions is investigated.  ...  of LMIs to ensure the SNNs with noise perturbations to be globally exponentially stable in the mean square.  ...  Acknowledgments This work was supported by the National Science and Technology Major Project of China (no. 2011ZX05020-006) and Natural Science Foundation of Hebei Province of China (A2011203103).  ...
doi:10.1155/2014/784107 fatcat:wahkkzjvo5awhertupiwtz3vgi
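
For orientation only: in the simplest deterministic, delay-free linearization, this family of criteria collapses to a Lyapunov LMI feasibility problem, sketched below with cvxpy and a toy matrix; the paper's actual conditions additionally handle jumps, mixed delays, and the α-inverse Hölder activations.

```python
# Minimal Lyapunov LMI sketch (toy 2x2 system; requires cvxpy + a solver).
# Feasibility of  P > 0,  A'P + PA < 0  certifies exponential stability.
import cvxpy as cp
import numpy as np

A = np.array([[-2.0, 0.5], [0.3, -1.5]])  # placeholder system matrix
n = A.shape[0]
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),
               A.T @ P + P @ A << -eps * np.eye(n)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print("LMI feasible (exponentially stable):", prob.status == cp.OPTIMAL)
```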

Almost Sure Stability of Stochastic Neural Networks with Time Delays in the Leakage Terms

Mingzhu Song, Quanxin Zhu, Hongwei Zhou
2016 Discrete Dynamics in Nature and Society  
The stability issue is investigated for a class of stochastic neural networks with time delays in the leakage terms.  ...  of stochastic neural networks.  ...  Concluding Remarks In this paper, we have investigated the almost sure stability analysis problem for a class of stochastic neural networks with time delays in the leakage terms.  ... 
doi:10.1155/2016/2487957 fatcat:3vdy336edbgmvnx2t5mejbsygq
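
"Leakage delay" means a delay inside the stabilizing decay term itself; the model form below is the standard one in this literature, assumed here rather than copied from the paper.

```latex
% Assumed standard form: leakage delay \delta > 0 in the decay term,
% transmission delay \tau(t), Brownian motion \omega(t).
\[
  dx(t) = \bigl[ -A\,x(t-\delta) + B f(x(t)) + C f(x(t-\tau(t))) \bigr]\, dt
        + \sigma\bigl( t,\, x(t),\, x(t-\tau(t)) \bigr)\, d\omega(t)
\]
% Almost sure stability requires trajectories to converge to the equilibrium
% with probability one, despite the stochastic perturbation.
```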

Achieving stable dynamics in neural circuits [article]

Leo Kozachkov, Mikael Lundqvist, Jean-Jacques Slotine, Earl K. Miller
2020 bioRxiv   pre-print
We approached this problem from a control-theory perspective by applying contraction analysis to recurrent neural networks.  ...  There are multiple sources of noise and variation yet activity has to eventually converge to a stable, reproducible state (or sequence of states) for its computations to make sense.  ...  Acknowledgments We thank Pawel Herman for comments on an earlier version of this manuscript. We thank Michael Happ and all members of the Miller Lab for helpful discussions and suggestions.  ... 
doi:10.1101/2020.01.17.910174 fatcat:s7nqxqivkjbctpereug6ah5oly
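
Contraction analysis certifies that all trajectories converge toward one another regardless of initial state, which is what makes activity reproducible under noise. A minimal sufficient check for an assumed rate-model form (not the authors' code) is sketched below.

```python
# Hypothetical sketch: for the rate model  dx/dt = -x + W*tanh(x) + u(t),
# the Jacobian is J = -I + W*diag(tanh') with 0 <= tanh' <= 1, so
# ||W||_2 < 1 keeps the symmetric part of J uniformly negative definite --
# a sufficient condition for contraction for every input u.
import numpy as np

rng = np.random.default_rng(0)
n = 50
W = 0.4 / np.sqrt(n) * rng.normal(size=(n, n))  # toy recurrent weights

spec = np.linalg.norm(W, 2)  # largest singular value
print(f"||W||_2 = {spec:.3f}; contracting: {spec < 1.0}")
```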

A Differentially Private Framework for Deep Learning with Convexified Loss Functions [article]

Zhigang Lu, Hassan Jameel Asghar, Mohamed Ali Kaafar, Darren Webb, Peter Dickinson
2022 arXiv   pre-print
(via the exponential mechanism) at the output layer of a baseline non-private neural network trained with a convexified loss function.  ...  Under a black-box setting, based on this global sensitivity, to control the overall noise injection, we propose a novel output perturbation framework by injecting DP noise into a randomly sampled neuron  ...  ACKNOWLEDGEMENTS We thank Bargav Jayaraman for clarifications on the use of their implementation of DP-SGD.  ... 
arXiv:2204.01049v1 fatcat:ogtzsv5egfesnmjzvndewxhg7e
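
For contrast with the paper's exponential-mechanism design, the plain output-perturbation idea can be sketched with the Gaussian mechanism; the sensitivity value, privacy parameters, and layer shape below are placeholders, and a bounded global sensitivity is exactly what the convexified loss is meant to buy.

```python
# Hypothetical output-perturbation sketch using the classical Gaussian
# mechanism (the paper itself perturbs a randomly sampled output neuron
# via the exponential mechanism instead).
import numpy as np

def gaussian_output_perturbation(w, sens, eps, delta, rng):
    """Release output-layer weights w under (eps, delta)-DP, for eps <= 1."""
    sigma = sens * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return w + rng.normal(0.0, sigma, size=w.shape)

rng = np.random.default_rng(0)
w = rng.normal(size=(10, 128))  # toy trained output layer (placeholder)
w_private = gaussian_output_perturbation(w, sens=1.0, eps=1.0, delta=1e-5, rng=rng)
```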

Coloring black boxes: visualization of neural network decisions [article]

Wlodzislaw Duch
2018 arXiv   pre-print
and optimization procedures, investigate stability of network classification under perturbation of original vectors, and place new data samples in relation to training data, allowing for estimation of  ...  confidence in classification of a given sample.  ...  Bottom row: comparison of RBF with MLP solutions for inputs perturbed by strong noise (15%).  ...
arXiv:1802.08478v1 fatcat:56y7wk4hyjgihlvmcnsg43tvg4
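
The stability probe described in the abstract can be sketched in a few lines: perturb one input many times and report how often the predicted class survives. The classifier interface and noise level are placeholders.

```python
# Hypothetical sketch of a classification-stability probe under input noise.
import numpy as np

def prediction_stability(predict, x, noise=0.15, n=200, rng=None):
    rng = rng or np.random.default_rng(0)
    base = predict(x)
    flips = sum(predict(x + noise * rng.normal(size=x.shape)) != base
                for _ in range(n))
    return 1.0 - flips / n  # fraction of perturbed copies keeping the label

# Usage with any callable classifier, e.g.:
#   stability = prediction_stability(lambda v: int(model(v).argmax()), x_sample)
```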

Neural Population Geometry Reveals the Role of Stochasticity in Robust Perception [article]

Joel Dapello, Jenelle Feather, Hang Le, Tiago Marques, David D. Cox, Josh H. McDermott, James J. DiCarlo, SueYeon Chung
2021 arXiv   pre-print
Geometric analysis of the stochastic networks reveals overlap between representations of clean and adversarially perturbed stimuli, and quantitatively demonstrates that competing geometric effects of stochasticity  ...  Recent work has proposed adding biologically-inspired components to visual neural networks as a way to improve their adversarial robustness.  ...  Grant (S.C., H.L.), a Friends of the McGovern Institute Fellowship (J.J.F.), a DOE CSGF Fellowship (J.J.F.), the PhRMA Foundation Postdoctoral Fellowship in Informatics (T.M), the Semiconductor Research  ... 
arXiv:2111.06979v1 fatcat:reldoes3vjdetcttxgwrpmmhau

Robust Neural Network Tracking Controller Using Simultaneous Perturbation Stochastic Approximation

Qing Song, J.C. Spall, Yeng Chai Soh, Jie Ni
2008 IEEE Transactions on Neural Networks  
The proposed neural control system guarantees closed-loop stability of the estimation system and good tracking performance.  ...  We introduce the conic sector theory to establish a robust neural control system, with guaranteed boundedness for both the input/output (I/O) signals and the weights of the neural network.  ...  Unlike the robust conic sector analysis for a pretrained neural network [16], we provide an online scheme for the robustness analysis of the neural control system.  ...
doi:10.1109/tnn.2007.912315 pmid:18467211 fatcat:6lhrbegek5fmtj52u5q35kg7d4
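
SPSA itself is standard and compact: the gradient is estimated from only two loss evaluations per step, whatever the parameter dimension. The gains and toy objective below follow common defaults, not the paper's controller.

```python
# Standard SPSA gradient estimate on a toy quadratic (placeholder objective).
import numpy as np

def spsa_gradient(loss, theta, c, rng):
    delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Rademacher perturbation
    return (loss(theta + c * delta) - loss(theta - c * delta)) / (2 * c) * (1.0 / delta)

rng = np.random.default_rng(0)
theta = np.zeros(5)
loss = lambda t: np.sum((t - 1.0) ** 2)
for k in range(1, 501):
    a_k, c_k = 0.1 / k ** 0.602, 0.1 / k ** 0.101  # Spall's standard gain schedules
    theta -= a_k * spsa_gradient(loss, theta, c_k, rng)
print(theta)  # approaches the minimizer (all ones)
```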

Stability Analysis of Stochastic Markovian Jump Neural Networks with Different Time Scales and Randomly Occurred Nonlinearities Based on Delay-Partitioning Projection Approach

Jianmin Duan, Manfeng Hu, Yongqing Yang, Liuxiao Guo
2013 Abstract and Applied Analysis  
In this paper, the mean square asymptotic stability of stochastic Markovian jump neural networks with different time scales and randomly occurred nonlinearities is investigated.  ...  In terms of linear matrix inequality (LMI) approach and delay-partitioning projection technique, delay-dependent stability criteria are derived for the considered neural networks for cases with or without  ...  In real life, neural networks have a phenomenon of information latching, and it is recognized that the best way for modeling this class of neural networks is Markovian jump system [16] [17] [18] .  ... 
doi:10.1155/2013/212469 fatcat:qdp5cyznkja6xklvexbnnbcany

Achieving Generalizable Robustness of Deep Neural Networks by Stability Training [chapter]

Jan Laermann, Wojciech Samek, Nils Strodthoff
2019 Lecture Notes in Computer Science  
We study the recently introduced stability training as a general-purpose method to increase the robustness of deep neural networks against input perturbations.  ...  robustness against a broader range of distortion strengths and types unseen during training, a considerably smaller hyperparameter dependence and less potentially negative side effects compared to data  ...  Stability Training Stability training aims to stabilize predictions of a deep neural network in response to small input distortions.  ... 
doi:10.1007/978-3-030-33676-9_25 fatcat:3h7sto7zqbb2bbg4rrietnxvn4
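
The core loss is simple to state; a minimal PyTorch sketch follows, where the Gaussian input distortion and the KL weighting follow the general stability-training recipe rather than the chapter's exact settings.

```python
# Hypothetical stability-training loss: task loss on clean inputs plus a
# penalty keeping predictions on perturbed copies close to clean ones.
import torch
import torch.nn.functional as F

def stability_loss(model, x, y, alpha=0.01, sigma=0.04):
    logits_clean = model(x)
    logits_noisy = model(x + sigma * torch.randn_like(x))  # small input distortion
    task = F.cross_entropy(logits_clean, y)
    stab = F.kl_div(F.log_softmax(logits_noisy, dim=-1),
                    F.softmax(logits_clean, dim=-1),
                    reduction="batchmean")
    return task + alpha * stab
```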

Stability Criteria for Stochastic Recurrent Neural Networks with Two Time-Varying Delays and Impulses

R. Raja, S.Marshal Anthoni
2010 International Journal of Computer Applications  
This paper is concerned with a stability problem for a class of stochastic recurrent impulsive neural networks with both discrete and distributed time-varying delays.  ...  Based on Lyapunov-Krasovskii functional and the linear matrix inequality (LMI) approach, we analyze the global asymptotic stability of impulsive neural networks.  ...  ACKNOWLEDGEMENTS The work of the first author was supported by UGC Rajiv Gandhi National Fellowship and the work of the second author was supported by the CSIR, New Delhi.  ... 
doi:10.5120/514-831 fatcat:26wa6tvj4rglrpjx6yrwweahha

Long-Term Memory Stabilized by Noise-Induced Rehearsal

Y. Wei, A. A. Koulakov
2014 Journal of Neuroscience  
We show that certain classes of STDP rules can stabilize all stored memory patterns despite a short lifetime of synapses.  ...  Cortical networks can maintain memories for decades despite the short lifetime of synaptic strengths. Can a neural network store long-lasting memories in unstable synapses?  ...  Therefore, the levels of noise have to be sufficiently low for the perturbation theory analysis to be valid. For bistability, we need the value of noise to be larger than a certain threshold.  ... 
doi:10.1523/jneurosci.3929-12.2014 pmid:25411507 pmcid:PMC4236406 fatcat:66lo46lb55aafk7uafjvpsg65m
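
A Hopfield-style caricature of the mechanism (plain Hebbian re-learning standing in for the paper's STDP analysis): synapses decay, spontaneous noise reactivates the stored attractor, and plasticity re-imprints it, so the memory outlives any single synaptic weight.

```python
# Toy rehearsal loop (all parameters are placeholders).
import numpy as np

rng = np.random.default_rng(0)
n = 100
xi = rng.choice([-1.0, 1.0], size=n)  # one stored pattern
Wm = np.outer(xi, xi) / n             # Hebbian weight matrix

def recall(W, steps=20):
    s = np.where(rng.random(n) < 0.3, -xi, xi)  # noisy spontaneous reactivation
    for _ in range(steps):
        s = np.sign(W @ s)            # attractor dynamics clean up the pattern
    return s

for epoch in range(50):
    Wm *= 0.9                         # synaptic decay (short synapse lifetime)
    s = recall(Wm)                    # noise-driven rehearsal of the memory
    Wm += 0.1 * np.outer(s, s) / n    # Hebbian re-imprinting of the recalled state

print("overlap with stored pattern:", abs(recall(Wm) @ xi) / n)
```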

Long-term memory stabilized by noise-induced rehearsal

Yi Wei, Alexei Koulakov
2013 BMC Neuroscience  
We show that certain classes of STDP rules can stabilize all stored memory patterns despite a short lifetime of synapses.  ...  Cortical networks can maintain memories for decades despite the short lifetime of synaptic strengths. Can a neural network store long-lasting memories in unstable synapses?  ...  Therefore, the levels of noise have to be sufficiently low for the perturbation theory analysis to be valid. For bistability, we need the value of noise to be larger than a certain threshold.  ... 
doi:10.1186/1471-2202-14-s1-p220 pmcid:PMC3704454 fatcat:wfvaegcppfa4bnpt6nah2n6bca
Showing results 1 — 15 out of 16,482 results