
Spatiotemporal Computations of an Excitable and Plastic Brain: Neuronal Plasticity Leads to Noise-Robust and Noise-Constructive Computations

Hazem Toutounji, Gordon Pipa, Olaf Sporns
2014 PLoS Computational Biology  
Nevertheless, no unifying account exists of how neurons in a recurrent cortical network learn to compute on temporally and spatially extended stimuli.  ...  To that end, we rigorously formulate the problem of neural representations as a relation in space between stimulus-induced neural activity and the asymptotic dynamics of excitable cortical networks.  ...  The network recurrence also provides a sustained but damped trace of past inputs (echo state [29] or fading memory [30] ) to propagate through the network.  ...
doi:10.1371/journal.pcbi.1003512 pmid:24651447 pmcid:PMC3961183 fatcat:6dft4ndyifeqvpcy73qnpxzxoi

A solution to the learning dilemma for recurrent networks of spiking neurons [article]

Guillaume Bellec, Franz Scherr, Anand Subramoney, Elias Hajek, Darjan Salaj, Robert Legenstein, Wolfgang Maass
2019 bioRxiv   pre-print
The resulting learning method -- called e-prop -- approaches the performance of BPTT (backpropagation through time), the best known method for training recurrent neural networks in machine learning.  ...  Recurrently connected networks of spiking neurons underlie the astounding information processing capabilities of the brain.  ...  This learning method for recurrent neural networks is called backpropagation through time (BPTT ) .  ... 
doi:10.1101/738385 fatcat:nilbsax3ozdu7jgm73bwwiqpbm
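The abstract above benchmarks e-prop against BPTT (backpropagation through time). As a reminder of what BPTT computes, here is a minimal scalar sketch of unrolling the chain rule backwards through a recurrence; the toy tanh network, weights, and input values are illustrative only, not the paper's spiking-network formulation.

```python
import math

def forward(w, xs):
    """Run the scalar recurrence h_t = tanh(w * h_{t-1} + x_t)."""
    h, hs = 0.0, [0.0]
    for x in xs:
        h = math.tanh(w * h + x)
        hs.append(h)
    return hs  # hs[t] is the state after t inputs

def bptt_grad(w, xs):
    """d(h_T)/dw obtained by unrolling the chain rule backwards in time."""
    hs = forward(w, xs)
    grad, delta = 0.0, 1.0                  # delta = d(h_T)/d(h_t)
    for t in range(len(xs), 0, -1):
        pre = w * hs[t - 1] + xs[t - 1]
        local = 1.0 - math.tanh(pre) ** 2   # tanh'(pre)
        grad += delta * local * hs[t - 1]   # direct dependence of step t on w
        delta *= local * w                  # carry the error one step back
    return grad

g = bptt_grad(0.7, [0.5, -1.0, 0.8])
```

The backward loop makes the credit-assignment problem the paper refers to concrete: every time step's contribution to the gradient depends on errors propagated from all later time steps, which is what makes exact BPTT biologically implausible and motivates local approximations such as e-prop.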

Insolvency Prediction Analysis of Italian Small Firms by Deep Learning

Agostino Di Ciaccio, Giovanni Cialone
2019 International Journal of Data Mining & Knowledge Management Process  
The study has been conducted using Gradient Boosting, Random Forests, Logistic Regression and some deep learning techniques: Convolutional Neural Networks and Recurrent Neural Networks.  ...  Gradient Boosting was the preferred model, although an increase in observation times would probably favour Recurrent Neural Networks.  ...  The use of Convolutional Neural Networks and Recurrent Neural Networks makes it possible to consider, albeit in different ways, the temporal relation existing between the observed data.  ...
doi:10.5121/ijdkp.2019.9601 fatcat:3e2uolho4jchjf75wngp6r6xy4

SWDE : A Sub-Word And Document Embedding Based Engine for Clickbait Detection [article]

Vaibhav Kumar, Mrinal Dhar, Dhruv Khattar, Yash Kumar Lal, Abhimanshu Mishra, Manish Shrivastava, Vasudeva Varma
2018 arXiv   pre-print
Finally, this representation is passed through a neural network to obtain a score for the headline.  ...  We generate sub-word level embeddings of the title using Convolutional Neural Networks and use them to train a bidirectional LSTM architecture.  ...  Bidirectional LSTM with Attention Recurrent Neural Network (RNN) is a class of artificial neural networks which utilizes sequential information and maintains history through its intermediate layers.  ... 
arXiv:1808.00957v1 fatcat:2zz3dmrvtbdbbdo3jzfftpfhd4
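The snippet above describes the bidirectional recurrent idea: one pass reads the sequence left-to-right, another right-to-left, and each position's representation combines both hidden states. A plain-Python toy of that idea follows; the ungated tanh recurrence and the weight values are illustrative, not the paper's actual LSTM.

```python
import math

def rnn_pass(seq, w_in=0.5, w_rec=0.3):
    """Simple recurrent pass: h_t = tanh(w_in * x_t + w_rec * h_{t-1})."""
    h, states = 0.0, []
    for x in seq:
        h = math.tanh(w_in * x + w_rec * h)
        states.append(h)
    return states

def bidirectional(seq):
    fwd = rnn_pass(seq)                 # left-to-right states
    bwd = rnn_pass(seq[::-1])[::-1]     # right-to-left states, re-aligned
    return list(zip(fwd, bwd))          # per-position (forward, backward) pair

states = bidirectional([1.0, -0.5, 2.0])
```

Each output position thus carries history from both directions, which is why bidirectional architectures suit whole-headline classification tasks like clickbait detection, where the full sequence is available at once.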

Activity-dependent Extrinsic Regulation of Adult Olfactory Bulb and Hippocampal Neurogenesis

Dengke K. Ma, Woon Ryoung Kim, Guo-li Ming, Hongjun Song
2009 Annals of the New York Academy of Sciences  
Here we review extrinsic mechanisms through which adult neurogenesis is regulated by environmental cues, physiological learning-related stimuli, and neuronal activities.  ...  Adult neurogenesis, a highly dynamic process, has been shown to be exquisitely modulated by neuronal circuit activity at different stages, from proliferation of adult neural progenitors, to differentiation  ...  Transient amplifying cells give rise to neuroblasts, migrating toward the OB through the rostral migratory stream (RMS).  ... 
doi:10.1111/j.1749-6632.2009.04373.x pmid:19686209 pmcid:PMC2729764 fatcat:4xg5vq3sczgufdypbdiw4njdbq

Probabilistic associative learning suffices for learning the temporal structure of multiple sequences

Ramon H. Martinez, Anders Lansner, Pawel Herman, Abigail Morrison
2019 PLoS ONE  
To this end, we employ a parsimonious firing-rate attractor network equipped with the Hebbian-like Bayesian Confidence Propagating Neural Network (BCPNN) learning rule relying on synaptic traces with asymmetric  ...  The proposed network model is able to encode and reproduce temporal aspects of the input, and offers internal control of the recall dynamics by gain modulation.  ...  by means of the combination of the Bayesian Confidence Propagating Neural Network (BCPNN) learning mechanism [46] and asymmetrical temporal synaptic traces.  ... 
doi:10.1371/journal.pone.0220161 pmid:31369571 pmcid:PMC6675053 fatcat:v6u2bi4rqnblndg6cn2pr6i2kq

Guided-GAN: Adversarial Representation Learning for Activity Recognition with Wearables [article]

Alireza Abedin, Hamid Rezatofighi, Damith C. Ranasinghe
2021 arXiv   pre-print
In this paper: we explore generative adversarial network (GAN) paradigms to learn unsupervised feature representations from wearable sensor data; and design a new GAN framework-Geometrically-Guided GAN  ...  All methods employ recurrent neural networks. (*) RFAAE is our recurrent adaptation of [16] .  ...  Recurrent Generative Adversarial Networks The standard GAN [11] comprises two parameterized feed-forward neural networks, a generator G_φ and a discriminator D_ω, competing against one another in a  ...
arXiv:2110.05732v1 fatcat:6syrhqvibjecrcdfkv42geh3vi
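The snippet above recaps the standard GAN setup: a generator G and a discriminator D play a minimax game over V(D, G) = E_x[log D(x)] + E_z[log(1 - D(G(z)))]. A toy Monte-Carlo evaluation of that objective follows; the constant discriminator, identity generator, and data distributions below are illustrative and have nothing to do with Guided-GAN's recurrent architecture.

```python
import math
import random

def gan_value(real, noise, D, G):
    """Monte-Carlo estimate of V(D, G) = E[log D(x)] + E[log(1 - D(G(z)))]."""
    term_real = sum(math.log(D(x)) for x in real) / len(real)
    term_fake = sum(math.log(1.0 - D(G(z))) for z in noise) / len(noise)
    return term_real + term_fake

# An uninformative discriminator outputs D(.) = 0.5 everywhere; the objective
# then evaluates to -log 4 (about -1.386) regardless of the data, which is the
# value at the GAN's theoretical equilibrium.
real = [random.gauss(0.0, 1.0) for _ in range(100)]
noise = [random.random() for _ in range(100)]
value = gan_value(real, noise, D=lambda x: 0.5, G=lambda z: z)
```

In training, D is updated to increase this value and G to decrease it, which is the adversarial competition the abstract refers to.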

Linking Structure and Function in Macroscale Brain Networks

Laura E. Suárez, Ross D. Markello, Richard F. Betzel, Bratislav Misic
2020 Trends in Cognitive Sciences  
Structural network reconstructions enriched with local molecular and cellular metadata, in concert with more nuanced representations of functions and properties, hold great potential for a truly multiscale  ...  However, network neuroscience research suggests that there is an imperfect link between structural connectivity and functional connectivity in the brain.  ...  The resulting adjustments in neural gain fundamentally alter how signals are routed through the network, how they are transformed, and ultimately, how they are integrated [113, 114] .  ... 
doi:10.1016/j.tics.2020.01.008 pmid:32160567 fatcat:wgrgy7xwsbgillhml2ou3b5xji

Probabilistic associative learning suffices for learning the temporal structure of multiple sequences [article]

Ramon Heberto Martinez, Pawel Herman, Anders Lansner
2019 bioRxiv   pre-print
To this end, we employ a parsimonious firing-rate attractor network equipped with the Hebbian-like Bayesian Confidence Propagating Neural Network (BCPNN) learning rule relying on synaptic traces with asymmetric  ...  temporal characteristics.  ...  the network becomes more sensitive.  ... 
doi:10.1101/545871 fatcat:6dg62beoc5hs5gbimxqwgwbew4

Dynamic patterns of correlated activity in the prefrontal cortex encode information about social behavior

Nicholas A Frost, Anna Haggart, Vikaas S Sohal
2021 PLoS Biology  
Notably, unlike optimal linear classifiers, a neural network classifier with a single linear hidden layer can discriminate network states which differ solely in patterns of coactivity, and not in the activity  ...  We developed an approach that combines a neural network classifier and surrogate (shuffled) datasets to characterize how neurons synergistically transmit information about social behavior.  ...  As discussed above, in the neocortex correlations and coactivity likely reflect recurrent neural network connectivity [39] .  ... 
doi:10.1371/journal.pbio.3001235 pmid:33939689 pmcid:PMC8118626 fatcat:ouvpwtbs5rf7taxvpf4qq7mdhi

Intrinsic excitation-inhibition imbalance affects medial prefrontal cortex differently in autistic men versus women [article]

Stavros Trakoshis, Pablo Martínez-Cañada, Federico Rocchi, Carola Canella, Wonsang You, Bhismadev Chakrabarti, Amber NV Ruigrok, Ed Bullmore, John Suckling, Marija Markicevic, Valerio Zerbi, Simon Baron-Cohen (+5 others)
2020 biorxiv/medrxiv   pre-print
In-silico modeling of local field potentials (LFP) from recurrent networks of interacting excitatory and inhibitory neurons shows that increasing the E:I ratio by specifically enhancing excitation attenuates  ...  Scale-free metrics of neural time-series data could represent biomarkers for E:I imbalance and could enable a greater understanding of how E:I imbalance affects different types of autistic individuals  ...  Results Analysis of E:I balance in simulated LFPs from a recurrent network model In a bottom-up fashion, we first worked to identify potential biomarkers of E:I imbalance from neural time-series data such  ... 
doi:10.1101/2020.01.16.909531 fatcat:mur5nr6ohbflhesnlv2wyxxsfm

Bio-inspired computer vision: Towards a synergistic approach of artificial and biological vision

N. V. Kartheek Medathati, Heiko Neumann, Guillaume S. Masson, Pierre Kornprobst
2016 Computer Vision and Image Understanding  
The dorsal pathway runs towards the parietal cortex, through motion areas MT and MST. The ventral pathway propagates through area V4 all along the temporal cortex, reaching area IT.  ...  The magnocellular (M) pathway conveys coarse, luminance-based spatial inputs with a strong temporal sensitivity towards Layer 4C α of area V1 where a characteristic population of cells, called stellate  ... 
doi:10.1016/j.cviu.2016.04.009 fatcat:lad5bwlqgbb5nhgtnxj6d32mc4

Integrating Various Neural Features Based on Mechanism of Intricate Balance and Ongoing Activity: Unified Neural Account Underlying and Correspondent to Mental Phenomena

Tien-Wen Lee, Gerald Tramontano
2021 World Journal of Neuroscience  
In recent decades, brain science has been enriched from both empirical and computational approaches.  ...  The enrichment of all the observations is inspired by empirical as well as theoretical neuroscience.  ...  Since chaotic dynamics is sensitive to initial conditions, neuronal/neural code is different from random perturbation because it may guide the trajectory toward a pertinent attractor.  ...
doi:10.4236/wjns.2021.112014 fatcat:j2etoe7aznf7ddleuuidm6e4ma

Intrinsic excitation-inhibition imbalance affects medial prefrontal cortex differently in autistic men versus women

Stavros Trakoshis, Pablo Martínez-Cañada, Federico Rocchi, Carola Canella, Wonsang You, Bhismadev Chakrabarti, Amber NV Ruigrok, Edward T Bullmore, John Suckling, Marija Markicevic, Valerio Zerbi, Anthony J Bailey (+35 others)
2020 eLife  
Results Analysis of E:I balance in simulated LFPs from a recurrent network model In a bottom-up fashion, we first worked to identify potential biomarkers of E:I imbalance from neural time-series data such  ...  Panel A shows a Venn diagram depicting the enrichment between autism-associated genes affecting excitatory neurons (Autism E-Genes) and DHT-sensitive genes.  ... 
doi:10.7554/elife.55684 pmid:32746967 fatcat:qox6uq2s7bfhfpvizmyokcl2pe

Real-time diameter of the fetal aorta from ultrasound

Nicoló Savioli, Enrico Grisan, Silvia Visentin, Erich Cosmi, Giovanni Montana, Pablo Lamata
2019 Neural Computing and Applications (Print)  
We propose a neural network architecture consisting of three blocks: a convolutional neural network (CNN) for the extraction of imaging features, a convolution gated recurrent unit (C-GRU) for exploiting  ...  the temporal redundancy of the signal, and a regularized loss function, called CyclicLoss, to impose our prior knowledge about the periodicity of the observed signal.  ...  The exploitation of temporal redundancy on US sequences was shown to be a solution for improving overall detection results of the fetal heart [22] , where a CNN coupled with a recurrent neural network  ... 
doi:10.1007/s00521-019-04646-3 pmid:32523256 pmcid:PMC7260154 fatcat:3kgzpuwlhbf4hdim52exbdxyce