268,938 Hits in 3.2 sec

Existence and stability of square-mean almost periodic solutions to a spatially extended neural network with impulsive noise

Stefano Bonaccorsi, Giacomo Ziglio
2014 Random Operators and Stochastic Equations  
Our work is concerned with a neural network with n nodes, where the activity of the k-th cell depends on external, stochastic inputs as well as the coupling generated by the activity of the adjacent cells  ..., transmitted through a diffusion process in the network.  ...  Our model is related to neural networks [12] and cellular neural networks [6, 7], although their dynamics is usually modelled through a finite dimensional system (possibly using some delay to model  ... 
doi:10.1515/rose-2014-0002 fatcat:62qslcpbuzdghg5vi4w3qnjf6i
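The snippet does not state the model explicitly; a generic form consistent with its description (n coupled cells, diffusive coupling through the network, impulsive stochastic input) would be a Lévy-driven system such as the following, where every symbol is an illustrative assumption rather than notation from the paper:

$$ du_k(t) = \Big[ -a_k\,u_k(t) + \sum_{j \sim k} c_{kj}\big(u_j(t) - u_k(t)\big) \Big]\,dt + dL_k(t), \qquad k = 1, \dots, n, $$

with a_k > 0 a decay rate, the sum running over the cells adjacent to k, and L_k a Lévy-type jump process modelling the impulsive noise.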

Characteristics of Dynamical Phase Transitions for Noise Intensities

Muyoung Heo, Jong-Kil Park, Kyungsik Kim
2014 Procedia Computer Science  
We simulate and analyze dynamical phase transitions in a Boolean neural network with initial random connections.  ...  The nature of the phase transition is found numerically and analytically for two connection probability density functions and one random network.  ... 
doi:10.1016/j.procs.2014.05.235 fatcat:uqj5x3dc2jdt3kfehsgr27m4we
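The exact network studied is not given in the snippet; the following minimal sketch of a Kauffman-style random Boolean network shows the standard damage-spreading diagnostic used to locate such order/chaos transitions (all parameter values are assumptions):

import numpy as np

rng = np.random.default_rng(0)
N, K = 1000, 2  # N nodes, K random inputs per node (assumed values)

# Random wiring and random Boolean update tables (Kauffman-style NK network).
inputs = rng.integers(0, N, size=(N, K))
tables = rng.random(size=(N, 2**K)) < 0.5  # a random Boolean function per node

def step(state):
    # Encode each node's K input bits as an index into its truth table.
    idx = np.zeros(N, dtype=int)
    for k in range(K):
        idx = (idx << 1) | state[inputs[:, k]]
    return tables[np.arange(N), idx].astype(int)

# Damage spreading: evolve two states differing in one bit and track their
# normalized Hamming distance, a standard order parameter for the transition.
a = rng.integers(0, 2, size=N)
b = a.copy(); b[0] ^= 1
for _ in range(200):
    a, b = step(a), step(b)
print("final Hamming distance:", np.mean(a != b))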

Compressing Deep Neural Networks: A New Hashing Pipeline Using Kac's Random Walk Matrices [article]

Jack Parker-Holder, Sam Gass
2018 arXiv   pre-print
However, despite the recent advancements in hardware, deep neural networks remain computationally intensive.  ...  Recent work has shown that by preserving the angular distance between vectors, random feature maps are able to reduce dimensionality without introducing bias to the estimator.  ...  Acknowledgements We could not have done any of this work without the inspiration and guidance of Krzysztof Choromanski, who introduced us to the beauty of random feature maps.  ... 
arXiv:1801.02764v3 fatcat:y4xzyhujcrbnhav7gppipwhsp4
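For context, Kac's random walk is a product of random two-coordinate (Givens) rotations; being orthogonal, it preserves angles between vectors exactly, which is the property angle-sensitive hashing relies on. A minimal sketch follows (step count and seeds are arbitrary assumptions; the paper's full hashing pipeline is not reproduced):

import numpy as np

rng = np.random.default_rng(1)

def kac_walk(x, n_steps):
    # Apply Kac's random walk: random 2-coordinate rotations, an
    # orthogonal map that preserves angles and norms.
    y = x.copy()
    for _ in range(n_steps):
        i, j = rng.choice(len(y), size=2, replace=False)
        theta = rng.uniform(0, 2 * np.pi)
        c, s = np.cos(theta), np.sin(theta)
        y[i], y[j] = c * y[i] - s * y[j], s * y[i] + c * y[j]
    return y

d = 256
u, v = rng.standard_normal(d), rng.standard_normal(d)
# Both vectors must see the same rotation sequence, so reset the generator
# (an illustrative trick, not the paper's construction).
rng = np.random.default_rng(2); u2 = kac_walk(u, 1000)
rng = np.random.default_rng(2); v2 = kac_walk(v, 1000)
angle = lambda a, b: np.arccos(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
print(angle(u, v), angle(u2, v2))  # the two angles agree: the map is orthogonal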

Optimal information loading into working memory in prefrontal cortex [article]

Jake P Stroud, Kei Watanabe, Takafumi Suzuki, Mark G Stokes, Máté Lengyel
2021 bioRxiv   pre-print
The neural circuit mechanisms underlying this information maintenance are thought to rely on persistent activity resulting from attractor dynamics.  ...  Extended Data Fig. 8 | Full-delay trained networks. a, Cost function for full-delay training on the random delay task (Methods).  ...  Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences.  ... 
doi:10.1101/2021.11.16.468360 fatcat:jpixppojk5ab5hz3pmhb4xwygm

Predictive Coding as Stimulus Avoidance in Spiking Neural Networks [article]

Atsushi Masumori, Lana Sinapayen, Takashi Ikegami
2019 arXiv   pre-print
In this study, we demonstrate that spiking neural networks with random structure spontaneously learn to predict temporal sequences of stimuli based solely on STDP.  ...  Our previous studies showed that action and selection for stimulation avoidance emerge in spiking neural networks through spike-timing dependent plasticity (STDP).  ...  Thus, spiking neural networks without specific structure (random networks) can spontaneously learn to predict simple stimulus sequences based solely on STDP. IV.  ... 
arXiv:1911.09230v1 fatcat:hdiu6m2u6bf4rm7x5pc5okpd2e
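For reference, the pair-based STDP rule this work builds on has the standard exponential window sketched below; the parameter values are generic textbook choices, not the paper's:

import numpy as np

# Pair-based STDP: potentiate when the presynaptic spike precedes the
# postsynaptic one, depress when it follows (illustrative parameters).
A_plus, A_minus = 0.01, 0.012
tau_plus, tau_minus = 20.0, 20.0  # ms

def stdp_dw(delta_t):
    # Weight change for a spike-time difference delta_t = t_post - t_pre (ms).
    if delta_t > 0:   # pre before post: potentiation
        return A_plus * np.exp(-delta_t / tau_plus)
    else:             # post before pre: depression
        return -A_minus * np.exp(delta_t / tau_minus)

for dt in (-40, -10, 10, 40):
    print(dt, stdp_dw(dt))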

Extended Anderson Criticality in Heavy-Tailed Neural Networks [article]

Asem Wardak, Pulin Gong
2022 arXiv   pre-print
We investigate the emergence of complex dynamics in networks with heavy-tailed connectivity by developing a non-Hermitian random matrix theory.  ...  We show that the rich nonlinear response properties of the extended critical regime can account for a variety of neural dynamics such as the diversity of timescales, providing a computational advantage  ...  The extended Anderson critical regime in heavy-tailed random neural networks.  ... 
arXiv:2202.05527v1 fatcat:jbwuy44kwjexzok4ezskn5kmpu
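A minimal numerical sketch of the setting described: a non-Hermitian random connectivity matrix with heavy-tailed (here Cauchy) entries, whose eigenvalue spectrum develops far-flung outliers absent from Gaussian ensembles. The distribution and scaling are generic assumptions, not the paper's exact construction:

import numpy as np

rng = np.random.default_rng(0)
N = 500
# Cauchy (Levy alpha = 1) couplings, scaled by N^{-1/alpha} = 1/N.
J = rng.standard_cauchy(size=(N, N)) / N

# Non-Hermitian spectrum: heavy tails break the uniform-disk picture of
# the Girko circular law that holds for Gaussian couplings.
eigs = np.linalg.eigvals(J)
print("spectral radius:", np.abs(eigs).max())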

A Novel Recurrent Neural Network for Solving Nonlinear Optimization Problems With Inequality Constraints

Youshen Xia, Gang Feng, Jun Wang
2008 IEEE Transactions on Neural Networks  
This paper presents a novel recurrent neural network for solving nonlinear optimization problems with inequality constraints.  ...  Simulation results show the effectiveness of the proposed neural network in solving nonlinearly constrained optimization problems.  ...  The extended projection neural network model for solving (1) can be described by (10). It is easy to see that the extended projection neural network model (10) has the same network complexity as the  ... 
doi:10.1109/tnn.2008.2000273 pmid:18701366 fatcat:njhz6u63mrcstbbplkjr2yuiby
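The paper's extended model (10) is not reproduced in the snippet; for orientation, the classical projection neural network it extends solves box-constrained problems via the dynamics dx/dt = P_Ω(x − α∇f(x)) − x, sketched here with forward Euler on a toy quadratic program (all numbers are illustrative):

import numpy as np

# Projection neural network for min f(x) subject to l <= x <= u:
#   dx/dt = P_Omega(x - alpha * grad_f(x)) - x.
Q = np.array([[4.0, 1.0], [1.0, 3.0]])
c = np.array([-1.0, -2.0])
l, u = np.array([0.0, 0.0]), np.array([1.0, 1.0])

grad_f = lambda x: Q @ x + c          # gradient of the quadratic objective
proj = lambda x: np.clip(x, l, u)     # projection onto the box constraints

x, alpha, dt = np.zeros(2), 0.2, 0.1
for _ in range(2000):                 # forward-Euler integration to equilibrium
    x = x + dt * (proj(x - alpha * grad_f(x)) - x)
print("approximate constrained minimizer:", x)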

COVID-Net US-X: Enhanced Deep Neural Network for Detection of COVID-19 Patient Cases from Convex Ultrasound Imaging Through Extended Linear-Convex Ultrasound Augmentation Learning [article]

E. Zhixuan Zeng, Adrian Florea, Alexander Wong
2022 arXiv   pre-print
In this study, we explore the impact of leveraging extended linear-convex ultrasound augmentation learning on producing enhanced deep neural networks for COVID-19 assessment, where we conduct data augmentation  ...  A major challenge to building deep neural networks for COVID-19 screening using POCUS is the heterogeneity in the types of probes used to capture ultrasound images (e.g., convex vs. linear probes), which  ...  To generate a random transformation: 1.  ... 
arXiv:2204.13851v1 fatcat:vbvw2khjt5cc5epztbvukyz6ru

Controlling the dynamics of multi-state neural networks

Tao Jin, Hong Zhao
2008 Journal of Statistical Mechanics: Theory and Experiment  
We then extend a design rule, which was presented recently for designing binary-state neural networks, to make it suitable for designing general multi-state neural networks.  ...  These dynamic behaviors are also observed in the binary-state neural networks; therefore, our results imply that they may be the universal behaviors of feedback neural networks.  ...  For designing multi-state neural networks, we should extend the original algorithm of the MCA rule.  ... 
doi:10.1088/1742-5468/2008/06/p06002 fatcat:sb3ajsetzjfphgjehrlikzgjpm

Page 155 of Neural Computation Vol. 5, Issue 1 [page]

1993 Neural Computation  
Learning in the Recurrent Random Neural Network  ...  of neural networks to control and optimization (Gelenbe and Batty 1992) and network learning.  ...  The work presented in this paper extends this approach to the random network model (Gelenbe 1989, 1990), which has the advantage of possessing well-defined fixed-point equations representing the stationary  ... 
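For context, the fixed-point equations of Gelenbe's random network model referenced here take the standard form

$$ q_i = \frac{\lambda^+_i}{r_i + \lambda^-_i}, \qquad \lambda^+_i = \Lambda_i + \sum_j q_j\, r_j\, p^+_{ji}, \qquad \lambda^-_i = \lambda_i + \sum_j q_j\, r_j\, p^-_{ji}, $$

where q_i is the stationary probability that neuron i is excited, r_i its firing rate, Λ_i and λ_i the external excitatory and inhibitory arrival rates, and p±_{ji} the probabilities that a spike from neuron j reaches neuron i as an excitatory or inhibitory signal (notation follows Gelenbe 1989, 1990).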

Networked Model Predictive Control Using a Wavelet Neural Network [article]

H. Khodabandehlou, M. Sami Fadali
2018 arXiv   pre-print
Simulation results show that the MPC with extended prediction horizon can effectively control the system in the presence of fixed or random network delay.  ...  The wavelet neural network (WNN) performs the online identification of the nonlinear system.  ...  Simulation results show that the model predictive controller with extended prediction horizon can successfully mitigate the effect of fixed and random network delay.  ... 
arXiv:1805.04549v1 fatcat:tzae42srmzbbjhhsv4poo3j2ia
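A minimal sketch of why an extended prediction horizon mitigates network delay (the wavelet-neural-network plant model and the actual optimizer are not reproduced; mpc_solve below is a hypothetical placeholder): the controller transmits a whole control sequence, and a plant-side buffer plays it out so a valid input is available even when packets arrive late.

from collections import deque

def mpc_solve(state, horizon):
    # Placeholder optimizer returning a control sequence of length `horizon`.
    return [0.0] * horizon

horizon, max_delay = 10, 3  # the horizon must cover the worst-case delay
state = 0.0
buffer = deque(mpc_solve(state, horizon))

for step in range(20):
    u = buffer.popleft() if buffer else 0.0  # keep actuating during delays
    # ... apply u to the plant and measure the new state ...
    if step % (max_delay + 1) == 0:          # a fresh packet finally arrives
        buffer = deque(mpc_solve(state, horizon))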

Coupling parameter in synchronization of diluted neural networks

Qi Li, Yong Chen, Ying Hai Wang
2002 Physical review. E, Statistical physics, plasmas, fluids, and related interdisciplinary topics  
We study the critical features of the coupling parameter in the synchronization of neural networks with diluted synapses.  ...  coupling intensity for synchronization in this spatially extended system.  ...  We consider a neural network model that consists of N analog neurons x_i(t) ∈ [0, 1], i = 1, ..., N. Each neuron x_i is connected with other neurons x_j by a random weighted coupling J_ij.  ... 
doi:10.1103/physreve.65.041916 pmid:12005882 fatcat:zj4gvtt6vzgsrn7t2capsmcnrq
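A toy sketch of the kind of experiment described: two replicas of a network of analog neurons x_i ∈ [0, 1] with randomly diluted couplings J_ij, mutually coupled with intensity eps; past a critical intensity their distance collapses. The transfer function, dilution level, and coupling scheme are all assumptions, not the paper's model:

import numpy as np

rng = np.random.default_rng(0)
N, dilution = 200, 0.8             # fraction of synapses removed (assumed)
f = lambda x: 4.0 * x * (1.0 - x)  # logistic transfer keeping x in [0, 1]

# Random weighted coupling with randomly deleted (diluted) synapses.
J = rng.uniform(-1, 1, size=(N, N))
J[rng.random((N, N)) < dilution] = 0.0
J /= np.abs(J).sum(axis=1, keepdims=True) + 1e-12  # row-normalize

def step(x, y, eps):
    # One update: each replica is nudged toward the other with intensity eps.
    fx, fy = f(np.clip(J @ x, 0, 1)), f(np.clip(J @ y, 0, 1))
    return (1 - eps) * fx + eps * fy, (1 - eps) * fy + eps * fx

for eps in (0.1, 0.3, 0.5):
    x, y = rng.random(N), rng.random(N)
    for _ in range(500):
        x, y = step(x, y, eps)
    print(eps, np.abs(x - y).mean())  # distance drops past a critical eps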

Deep Randomized Neural Networks [article]

Claudio Gallicchio, Simone Scardapane
2021 arXiv   pre-print
In recent years, the study of Randomized Neural Networks has been extended towards deep architectures, opening new research directions to the design of effective yet extremely efficient deep learning models  ...  Randomized Neural Networks explore the behavior of neural systems where the majority of connections are fixed, either in a stochastic or a deterministic fashion.  ...  We now turn to the topic of deep randomized recurrent neural networks.  ... 
arXiv:2002.12287v2 fatcat:iy6r4bzka5bitjdugsfgbpioia
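A canonical example of the paradigm surveyed is the echo state network, where the input and recurrent weights are fixed at random values and only a linear readout is trained. A minimal sketch on a toy one-step-memory task (all sizes and scalings are illustrative):

import numpy as np

rng = np.random.default_rng(0)
n_in, n_res, T = 1, 300, 1000

# Fixed random input and recurrent weights; spectral radius kept below 1.
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()

u = rng.uniform(-1, 1, size=(T, n_in))
y_target = np.roll(u[:, 0], 1)  # toy task: reproduce the previous input

X = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W_in @ u[t] + W @ x)  # untrained random reservoir dynamics
    X[t] = x

# Ridge-regression readout: the only trained part of the model.
lam = 1e-6
w_out = np.linalg.solve(X.T @ X + lam * np.eye(n_res), X.T @ y_target)
print("train MSE:", np.mean((X @ w_out - y_target) ** 2))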

On Interference of Signals and Generalization in Feedforward Neural Networks [article]

Artur Rataj
2003 arXiv   pre-print
A test of a feedforward neural network is performed that shows the discussed random generalization.  ...  This study is done on the basis of a feedforward artificial neural network.  ...  neural network output vectors.  ... 
arXiv:cs/0310009v3 fatcat:qj2oerremjg2ph2cbtl3qbgdpa

Hierarchical modular structure enhances the robustness of self-organized criticality in neural networks

Sheng-Jun Wang, Changsong Zhou
2012 New Journal of Physics  
Here, we find that networks with modular structure can extend the parameter region of coupling strength over which critical states are reached, compared to non-modular networks.  ...  One of the most prominent architectural properties of neural networks in the brain is their hierarchical modular structure. How does this structural property constrain or improve brain function?  ...  It is notable that the region of small deviation can be extended by the modular structure compared with random networks. When the rewiring probability is too large, e.g.  ... 
doi:10.1088/1367-2630/14/2/023005 fatcat:vwcosmy73vbb3hmnh5uuqkcc6m
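A single-level sketch of the construction the snippet's "rewiring probability" suggests (the paper's networks are hierarchically modular; module counts, sizes, and degrees below are assumptions): wire each node inside its module, then rewire a fraction p of edges to random targets, so small p preserves modular structure and p = 1 yields a homogeneous random network.

import numpy as np

rng = np.random.default_rng(0)
n_mod, mod_size, k = 4, 50, 10   # assumed module count, size, and out-degree
N = n_mod * mod_size

def modular_network(p):
    # Directed modular network: k intra-module targets per node, each
    # rewired to a uniformly random node with probability p.
    A = np.zeros((N, N), dtype=int)
    for m in range(n_mod):
        lo = m * mod_size
        for i in range(lo, lo + mod_size):
            candidates = np.r_[lo:i, i + 1:lo + mod_size]
            for j in rng.choice(candidates, size=k, replace=False):
                if rng.random() < p:          # rewire across the whole network
                    j = rng.integers(0, N)
                    while j == i:
                        j = rng.integers(0, N)
                A[i, j] = 1
    return A

for p in (0.0, 0.1, 1.0):
    A = modular_network(p)
    intra = sum(A[m*mod_size:(m+1)*mod_size, m*mod_size:(m+1)*mod_size].sum()
                for m in range(n_mod))
    print(p, "intra-module edge fraction:", intra / A.sum())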
Showing results 1 — 15 out of 268,938 results