21,821 Hits in 2.9 sec

Gaussian Process Neurons Learn Stochastic Activation Functions [article]

Sebastian Urban, Marcus Basalla, Patrick van der Smagt
2017 arXiv   pre-print
We propose stochastic, non-parametric activation functions that are fully learnable and individual to each neuron.  ...  Complexity and the risk of overfitting are controlled by placing a Gaussian process prior over these functions.  ...  This probabilistic treatment transforms a neuron into a probabilistic unit, which we call a Gaussian process neuron (GPN).  ...
arXiv:1711.11059v1 fatcat:f435diw5fne3lhtkfvsmsr6fsu
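To make the idea concrete, here is a minimal illustrative sketch (Python/NumPy, not the authors' code) of a neuron whose activation function is a single draw from a Gaussian process prior; the squared-exponential kernel, the evaluation grid, and the interpolation step are all assumptions for illustration.

    import numpy as np

    def sample_gp_activation(lengthscale=1.0, variance=1.0, seed=0):
        """Draw one activation function from a GP prior and return it as a callable."""
        rng = np.random.default_rng(seed)
        grid = np.linspace(-5.0, 5.0, 200)                     # pre-activation values to sample at
        d = grid[:, None] - grid[None, :]
        K = variance * np.exp(-0.5 * (d / lengthscale) ** 2)   # squared-exponential covariance
        f = rng.multivariate_normal(np.zeros(grid.size), K + 1e-8 * np.eye(grid.size))
        return lambda x: np.interp(x, grid, f)                 # evaluate the sampled function

    act = sample_gp_activation()            # each neuron would get its own draw
    print(act(np.array([-2.0, 0.0, 1.5])))  # stochastic, non-parametric activation values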

Regularizing DNN acoustic models with Gaussian stochastic neurons

Hao Zhang, Yajie Miao, Florian Metze
2015 2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)  
In this paper we investigate the effect of Gaussian stochastic neurons on DNN acoustic modeling.  ...  Gaussian stochastic neurons can give improvements on large data sets, where Dropout tends to be less useful.  ...  The pre-Gaussian noise perturbation is added before the activation function.  ...
doi:10.1109/icassp.2015.7178915 dblp:conf/icassp/ZhangMM15 fatcat:3v2rnlpkrjajxox36o5fnxcuda
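A minimal sketch of the quoted idea, assuming a fully connected layer and a ReLU nonlinearity (both assumptions; the noise level sigma is a placeholder): Gaussian noise is added to the pre-activation before the activation function is applied.

    import numpy as np

    def gaussian_stochastic_layer(x, W, b, sigma=0.1, rng=np.random.default_rng(0)):
        """Pre-Gaussian noise: perturb the pre-activation W @ x + b, then apply the nonlinearity."""
        pre = W @ x + b
        pre_noisy = pre + rng.normal(0.0, sigma, size=pre.shape)  # noise added before activation
        return np.maximum(pre_noisy, 0.0)                         # e.g. ReLU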

Information Processing in Neuron with Exponential Distributed Delay

Saket Kumar Choudhary, Sunil Kumar Bharti
2018 International Journal of Machine Learning and Networked Collaborative Engineering  
Spiking activity is analyzed via a rate-code neuronal information processing scheme in Section IV.  ...  (Eq. (11)) is a Gaussian distribution and is asymptotically similar to the SVD of the LIF neuron with stochastic input (Eq. (15)).  ...
doi:10.30991/ijmlnce.2018v02i02.003 fatcat:wewzt7bsinf6hl76l5yafobw6u

Neural Information Processing: between synchrony and chaos

Josep Rossello, Vincent Canals, Antoni Morro
2012 Nature Precedings  
functionality.  ...  The brain is characterized by performing many different processing tasks, ranging from elaborate processes such as pattern recognition, memory or decision-making to simpler functionalities such as linear filtering.  ...  An example of non-Gaussian pattern recognition is shown in Fig. 4b, where we represent the activity of the output neuron of a classifier (the colour map indicates this activity) as a function of the values  ...
doi:10.1038/npre.2012.6935.1 fatcat:krs3z6uoxnaabmjbdfee64knq4

Gaussian Error Linear Units (GELUs) [article]

Dan Hendrycks, Kevin Gimpel
2020 arXiv   pre-print
The GELU activation function is xΦ(x), where Φ(x) is the standard Gaussian cumulative distribution function.  ...  We propose the Gaussian Error Linear Unit (GELU), a high-performing neural network activation function.  ...  Since the cumulative distribution function of a Gaussian is often computed with the error function, we define the Gaussian Error Linear Unit (GELU) as GELU(x) = xP(X ≤ x) = xΦ(x) = x · ½[1 + erf(x/√2)].  ...
arXiv:1606.08415v4 fatcat:aa6abid7rzc6jhfbatpyqyn24u
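The definition quoted above translates directly into a few lines of Python; using scipy.special.erf is just one convenient way to evaluate the Gaussian CDF.

    import numpy as np
    from scipy.special import erf

    def gelu(x):
        """GELU(x) = x * Phi(x), with Phi the standard Gaussian CDF."""
        return x * 0.5 * (1.0 + erf(x / np.sqrt(2.0)))

    print(gelu(np.array([-1.0, 0.0, 1.0])))  # approx [-0.159, 0.0, 0.841]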

Steps Toward Deep Kernel Methods from Infinite Neural Networks [article]

Tamir Hazan, Tommi Jaakkola
2015 arXiv   pre-print
We show that deep infinite layers are naturally aligned with Gaussian processes and kernel methods, and devise stochastic kernels that encode the information of these networks.  ...  The output of each such neuron is the activation of the transfer function ψ_x(u) = f(⟨φ_x, u⟩_µ).  ...  Transfer functions imitate neuron behavior, activating whenever the linear input ⟨x, w⟩ is high enough.  ...
arXiv:1508.05133v2 fatcat:iwnescjlubenlmntue3uofm3a4
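One way to read this is the usual infinite-width construction, where the kernel is the expectation of products of neuron activations over random incoming weights. The sketch below is a Monte Carlo illustration under assumed choices (f = ReLU, standard Gaussian weights), not the stochastic kernels devised in the paper.

    import numpy as np

    def mc_kernel(x1, x2, n_units=20000, f=lambda z: np.maximum(z, 0.0), seed=0):
        """Estimate k(x1, x2) = E_u[ f(<x1, u>) * f(<x2, u>) ] with u ~ N(0, I)."""
        rng = np.random.default_rng(seed)
        U = rng.normal(size=(n_units, x1.shape[0]))   # one random weight vector per hidden unit
        return float(np.mean(f(U @ x1) * f(U @ x2)))  # average over the (finite) layer

    x, y = np.array([1.0, -0.5]), np.array([0.3, 0.8])
    print(mc_kernel(x, y))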

A synapse-centric account of the free energy principle [article]

David Kappel, Christian Tetzlaff
2021 arXiv   pre-print
This synapse-centric account of the FEP predicts that synapses interact with the soma of the post-synaptic neuron through stochastic synaptic releases to probe their behavior and use back-propagating action  ...  The parameters of the learning rules are fully determined by the parameters of the post-synaptic neuron model, suggesting a close interplay between the synaptic and somatic compartments and making precise  ...  Eq. (11) is an Ornstein-Uhlenbeck (OU) process that describes the dynamics of the LIF neuron model with stochastic inputs.  ...
arXiv:2103.12649v1 fatcat:6m54zrso3bccjicihc4ic5yraa
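For readers unfamiliar with the OU description of LIF dynamics, here is a small Euler-Maruyama simulation of dV = -(V - mu)/tau dt + sigma dW; the parameter values below are illustrative assumptions, not the paper's exact model.

    import numpy as np

    def simulate_ou(v0=0.0, mu=10.0, tau=20.0, sigma=2.0, dt=0.1, steps=1000, seed=0):
        """Euler-Maruyama integration of dV = -(V - mu)/tau dt + sigma dW."""
        rng = np.random.default_rng(seed)
        v = np.empty(steps)
        v[0] = v0
        for t in range(1, steps):
            dW = rng.normal(0.0, np.sqrt(dt))
            v[t] = v[t - 1] - (v[t - 1] - mu) / tau * dt + sigma * dW
        return v

    print(simulate_ou()[-5:])  # membrane potential settles around mu with fluctuations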

Inherent Weight Normalization in Stochastic Neural Networks [article]

Georgios Detorakis, Sourav Dutta, Abhishek Khanna, Matthew Jerry, Suman Datta, Emre Neftci
2019 arXiv   pre-print
The always-on stochasticity of the NSM confers the following advantages: the network is identical in the inference and learning phases, making the NSM suitable for online learning, and it can exploit stochasticity  ...  Here, we further demonstrate that always-on multiplicative stochasticity combined with simple threshold neurons provides sufficient operations for deep neural networks.  ...  NSMs are closely related to MC Dropout, with the exception that the activation function is stochastic and the neurons are binary.  ...
arXiv:1910.12316v1 fatcat:vrodwxzlv5dxxfkwom4uzgne24
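A heavily simplified sketch of "always-on multiplicative stochasticity combined with simple threshold neurons": Bernoulli gating of the inputs (as in dropout, but never switched off) followed by a binary threshold. The keep probability and the zero threshold are assumptions for illustration.

    import numpy as np

    def nsm_like_layer(x, W, b, keep_prob=0.5, rng=np.random.default_rng(0)):
        """Multiplicative noise on the inputs, then binary threshold neurons."""
        mask = rng.binomial(1, keep_prob, size=x.shape)  # always-on multiplicative stochasticity
        pre = W @ (mask * x) + b
        return (pre > 0).astype(float)                   # binary threshold activation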

Network Plasticity as Bayesian Inference

David Kappel, Stefan Habenschuss, Robert Legenstein, Wolfgang Maass, Jeff Beck
2015 PLoS Computational Biology  
We propose that inherently stochastic features of synaptic plasticity and spine motility enable cortical networks of neurons to carry out probabilistic inference by sampling from a posterior distribution  ...  The resulting new theory of network plasticity explains, from a functional perspective, a number of experimental findings on stochastic aspects of synaptic plasticity that previously appeared quite puzzling.  ...  The first two terms can be modeled as drift terms in a stochastic process.  ...
doi:10.1371/journal.pcbi.1004485 pmid:26545099 pmcid:PMC4636322 fatcat:tpjwj3gjencbvpc5fxixy3za5i
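The "drift plus noise" picture of sampling from a posterior can be illustrated with unadjusted Langevin dynamics on a single synaptic parameter; the quadratic log-posterior and the step size below are assumptions for illustration, not the paper's network model.

    import numpy as np

    def langevin_step(theta, grad_log_post, lr=1e-3, rng=np.random.default_rng(0)):
        """One step of theta += lr * grad log p(theta | data) + sqrt(2 * lr) * noise."""
        drift = lr * grad_log_post(theta)                             # deterministic drift term
        diffusion = np.sqrt(2.0 * lr) * rng.normal(size=theta.shape)  # stochastic exploration
        return theta + drift + diffusion

    # Example: sample a single weight from an assumed Gaussian posterior N(1, 0.5^2).
    grad = lambda th: -(th - 1.0) / 0.25
    theta = np.array([0.0])
    for _ in range(5000):
        theta = langevin_step(theta, grad)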

Convolutional Neural Networks Regularized by Correlated Noise [article]

Shamak Dutta, Bryan Tripp, Graham Taylor
2018 arXiv   pre-print
Neurons in the visual cortex are correlated in their variability. The presence of correlation impacts cortical processing because noise cannot be averaged out over many neurons.  ...  Inspired by the cortex, correlation is defined as a function of the distance between neurons and their selectivity.  ...  Studying stochastic neurons is interesting because the effect of stochasticity on learning and computation in artificial neural systems may help us model biological neurons.  ...
arXiv:1804.00815v1 fatcat:ne4uoktyire53f652fki7bdlge
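To illustrate "correlation as a function of the distance between neurons", the sketch below draws additive Gaussian noise whose covariance decays exponentially with distance along a 1-D feature map; the decay form and length scale are assumptions, and the paper additionally ties correlation to selectivity.

    import numpy as np

    def correlated_noise(n_units, length_scale=3.0, std=0.1, seed=0):
        """Sample zero-mean Gaussian noise with distance-dependent correlations."""
        rng = np.random.default_rng(seed)
        pos = np.arange(n_units)
        corr = np.exp(-np.abs(pos[:, None] - pos[None, :]) / length_scale)
        cov = (std ** 2) * corr + 1e-8 * np.eye(n_units)
        return rng.multivariate_normal(np.zeros(n_units), cov)

    activations = np.ones(16)
    noisy = activations + correlated_noise(16)  # correlated noise does not average out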

Stochastic Artificial Intelligence: Review Article [chapter]

T.D. Raheni, P. Thirumoorthi
2020 Deterministic Artificial Intelligence  
Humans build machines using these intelligent techniques to perform various processes in different fields.  ...  This chapter explains artificial neural network-based adaptive linear neuron networks, back-propagation networks, and radial basis networks.  ...  Unlike the back-propagation algorithm, which uses a sigmoid activation function, the radial basis network uses a Gaussian function as its activation function.  ...
doi:10.5772/intechopen.90003 fatcat:wbfuubs3pndcxpgunw2bcpwtke
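As a point of contrast with sigmoid units, a radial basis layer applies a Gaussian to the distance from learned centers; the centers and width below are placeholders, not values from the chapter.

    import numpy as np

    def rbf_layer(x, centers, width=1.0):
        """Hidden-unit outputs exp(-||x - c||^2 / (2 * width^2)) for each center c."""
        d2 = np.sum((x[None, :] - centers) ** 2, axis=1)  # squared distance to each center
        return np.exp(-d2 / (2.0 * width ** 2))           # Gaussian activation

    centers = np.array([[0.0, 0.0], [1.0, 1.0]])
    print(rbf_layer(np.array([0.5, 0.5]), centers))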

Analysis on Noisy Boltzmann Machines and Noisy Restricted Boltzmann Machines

Wenhao Lu, Chi-Sing Leung, John Sum
2021 IEEE Access  
(Eq. (34)), where η = 1 + σ².  ...  [Figure 4: the measured and approximated stochastic activation functions for the Gaussian case.]  ...  The BM model is a kind of stochastic recurrent network. The neurons' states are governed by a stochastic activation function.  ...
doi:10.1109/access.2021.3102275 fatcat:wfmkl23ymbghfakeejjcpu5hta
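A minimal sketch of a stochastic activation function of the Boltzmann-machine type: the unit switches on with probability given by a sigmoid of its (here noise-perturbed) net input. The Gaussian noise term stands in for the "noisy" variant, and its standard deviation is an assumption.

    import numpy as np

    def stochastic_bm_unit(net_input, sigma=0.5, rng=np.random.default_rng(0)):
        """Binary state sampled with probability sigmoid(net_input + Gaussian noise)."""
        noisy = net_input + rng.normal(0.0, sigma)
        p_on = 1.0 / (1.0 + np.exp(-noisy))   # stochastic activation function
        return float(rng.random() < p_on)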

SPA: Stochastic Probability Adjustment for System Balance of Unsupervised SNNs [article]

Xingyu Yang, Mingyuan Meng, Shanlin Xiao, Zhiyi Yu
2020 arXiv   pre-print
The movement of synaptic transmitter between different clusters is modeled as a Brownian-like stochastic process in which the transmitter distribution is adaptive at different firing phases.  ...  The SPA maps the synapses and neurons of SNNs into a probability space where a neuron and all connected pre-synapses are represented by a cluster.  ...  In this paper, based on a stochastic process, we propose a Stochastic Probability Adjustment (SPA) system composed of adaptive spiking neuron models and stochastic synapse models with Gaussian-distributed  ... 
arXiv:2010.09690v1 fatcat:pe2yzv3ezfaylgk2odkqppbgxe
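A very loose illustration of a "Brownian-like stochastic process" for the transmitter amount assigned to a cluster, with a drift adapted by recent firing activity; every constant and the reflection at zero are assumptions, not the SPA system's actual model.

    import numpy as np

    def update_transmitter(level, firing_rate, drift_gain=0.01, noise_std=0.05,
                           rng=np.random.default_rng(0)):
        """One Brownian-like update of a cluster's transmitter amount."""
        drift = drift_gain * firing_rate                 # adaptive drift from the firing phase
        level = level + drift + rng.normal(0.0, noise_std)
        return max(level, 0.0)                           # transmitter amount stays non-negative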

Closed-Form Expressions of Some Stochastic Adapting Equations for Nonlinear Adaptive Activation Function Neurons

Simone Fiori
2003 Neural Computation  
In recent works we introduced nonlinear adaptive activation function (FAN) artificial neuron models, which learn their activation functions in an unsupervised way via information-theoretic adaptation rules  ...  We also applied networks of these neurons to some blind signal processing problems, such as independent component analysis and blind deconvolution.  ...  Auto-organization, or unsupervised learning, denotes spontaneous learning to perform sensory processing, perception and inference.  ...
doi:10.1162/089976603322518795 pmid:14629873 fatcat:2dlsmmnskzhgrcef72djfmqw4a
Showing results 1 — 15 out of 21,821 results