
Deep Connectomics Networks: Neural Network Architectures Inspired by Neuronal Networks [article]

Nicholas Roberts, Dian Ang Yap, Vinay Uday Prabhu
2019 arXiv   pre-print
We bridge this gap by presenting initial results of Deep Connectomics Networks (DCNs) as DNNs with topologies inspired by real-world neuronal networks.  ...  We show high classification accuracy obtained by DCNs whose architecture was inspired by the biological neuronal networks of C. elegans and the mouse visual cortex.  ...  Conclusion We demonstrated initial findings from applying networks inspired by real-world neuronal network topologies to deep learning.  ...
arXiv:1912.08986v1 fatcat:aiee7np72zh2dh5iekshn7jrk4
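
The DCN idea of fixing a layer's connectivity to a given graph can be illustrated with a short sketch. This is a minimal, hypothetical example (the adjacency here is random; the paper derives its topologies from the C. elegans connectome and the mouse visual cortex), not the authors' implementation:

    import torch
    import torch.nn as nn

    class GraphMaskedLinear(nn.Module):
        """Linear layer whose connectivity is fixed by a binary adjacency mask."""
        def __init__(self, adjacency: torch.Tensor):
            super().__init__()
            out_features, in_features = adjacency.shape
            self.linear = nn.Linear(in_features, out_features)
            # Non-trainable mask: 1 where the graph has an edge, 0 elsewhere.
            self.register_buffer("mask", adjacency.float())

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # Zero out weights that do not correspond to an edge in the graph.
            return nn.functional.linear(x, self.linear.weight * self.mask, self.linear.bias)

    # Example: a random sparse "connectome" between 64 inputs and 32 units.
    adj = (torch.rand(32, 64) < 0.1).float()
    out = GraphMaskedLinear(adj)(torch.randn(8, 64))  # -> shape (8, 32)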

FusionNet: A deep fully residual convolutional neural network for image segmentation in connectomics [article]

Tran Minh Quan, David G. C. Hildebrand, Won-Ki Jeong
2016 arXiv   pre-print
In this paper, we introduce a novel deep neural network architecture, FusionNet, for the automatic segmentation of neuronal structures in connectomics data.  ...  deeper network architecture for a more accurate segmentation.  ...  Overall, these related studies inspired us to propose a fully residual convolutional neural network for analyzing connectomic data.  ... 
arXiv:1612.05360v2 fatcat:5y44en7xnbaizbktrhpdrkljfi
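
FusionNet is described above as a fully residual encoder-decoder. A minimal sketch of the kind of residual convolutional block such an architecture stacks (channel counts are illustrative, not taken from the paper):

    import torch
    import torch.nn as nn

    class ResidualBlock(nn.Module):
        """Two 3x3 convolutions with a skip connection, the building block of residual encoder-decoders."""
        def __init__(self, channels: int):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=3, padding=1),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, kernel_size=3, padding=1),
                nn.BatchNorm2d(channels),
            )
            self.relu = nn.ReLU(inplace=True)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # Residual (skip) connection: add the block input back to its output.
            return self.relu(self.body(x) + x)

    y = ResidualBlock(64)(torch.randn(1, 64, 128, 128))  # output keeps the input shape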

FusionNet: A Deep Fully Residual Convolutional Neural Network for Image Segmentation in Connectomics

Tran Minh Quan, David Grant Colburn Hildebrand, Won-Ki Jeong
2021 Frontiers in Computer Science  
Here, we introduce a deep neural network architecture, FusionNet, with a focus on its application to accomplish automatic segmentation of neuronal structures in connectomics data.  ...  This results in a much deeper network architecture and improves segmentation accuracy.  ...  CONCLUSIONS In this paper, we introduced a deep neural network architecture for image segmentation with a focus on connectomics EM image analysis.  ... 
doi:10.3389/fcomp.2021.613981 fatcat:cq7fmvcorzanpfz3tebezifp6u

A Novel Transfer Learning Approach to Enhance Deep Neural Network Classification of Brain Functional Connectomes

Hailong Li, Nehal A. Parikh, Lili He
2018 Frontiers in Neuroscience  
However, whole-brain classification that combines brain connectomes with deep learning has been hindered by insufficient training samples.  ...  We developed a deep transfer learning neural network (DTL-NN) framework for enhancing the classification of whole brain functional connectivity patterns.  ...  We propose a deep transfer learning neural network (DTL-NN) model by utilizing relatively easy-to-obtain FC patterns from a database of healthy subjects.  ...
doi:10.3389/fnins.2018.00491 pmid:30087587 pmcid:PMC6066582 fatcat:5pcoso4gzffzvbkw2nathwuw7e
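
The transfer-learning pattern described in this abstract (pretrain on easy-to-obtain healthy-subject FC patterns, then adapt to a small clinical dataset) can be sketched generically as follows; layer sizes and names are placeholders, not the DTL-NN specification:

    import torch
    import torch.nn as nn

    # Small fully connected classifier over vectorized functional-connectivity features.
    model = nn.Sequential(
        nn.Linear(4950, 256), nn.ReLU(),   # 4950 = upper triangle of a 100x100 FC matrix
        nn.Linear(256, 64), nn.ReLU(),
        nn.Linear(64, 2),                  # target task: two-class diagnosis
    )

    # ... assume all but the last layer were pretrained on healthy-subject FC data ...

    # Freeze the pretrained feature layers.
    for layer in list(model.children())[:-1]:
        for p in layer.parameters():
            p.requires_grad = False

    # Fine-tune only the final classification layer on the small target dataset.
    optimizer = torch.optim.Adam((p for p in model.parameters() if p.requires_grad), lr=1e-3)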

An Adversarial and Densely Dilated Network for Connectomes Segmentation

Ke Chen, Dandan Zhu, Jianwei Lu, Ye Luo
2018 Symmetry  
Specifically, we design a densely dilated network (DDN) as the segmentor to allow a deeper architecture and larger receptive fields for more accurate segmentation.  ...  In our paper, we propose a novel connectome segmentation framework called the adversarial and densely dilated network (ADDN) to address these issues.  ...  As far as we know, this is the first time an adversarial neural network has been applied to connectome segmentation with EM images.  ...
doi:10.3390/sym10100467 fatcat:ifa6b7ui7vhttbfm7xd7ni3etm
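
The dilated-convolution idea behind the DDN segmentor, growing the receptive field without extra downsampling, can be sketched as below; the dilation rates and channel counts are illustrative, not the paper's configuration:

    import torch
    import torch.nn as nn

    class DilatedBlock(nn.Module):
        """Parallel 3x3 convolutions with growing dilation, densely concatenated and fused."""
        def __init__(self, channels: int, rates=(1, 2, 4, 8)):
            super().__init__()
            self.branches = nn.ModuleList(
                nn.Conv2d(channels, channels, kernel_size=3, padding=r, dilation=r)
                for r in rates
            )
            # Fuse the concatenated branches back to the original channel count.
            self.fuse = nn.Conv2d(channels * len(rates), channels, kernel_size=1)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.fuse(torch.cat([b(x) for b in self.branches], dim=1))

    y = DilatedBlock(32)(torch.randn(1, 32, 256, 256))  # padding = dilation keeps the spatial size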

Graph Neural Networks in Network Neuroscience [article]

Alaa Bessadok, Mohamed Ali Mahjoub, Islem Rekik
2021 arXiv   pre-print
Operating natively on non-Euclidean data, the graph neural network (GNN) provides a clever way of learning deep graph structure, and it is rapidly becoming the state of the art, leading to enhanced performance  ...  We conclude by charting a path toward a better application of GNN models in the network neuroscience field for neurological disorder diagnosis and population graph integration.  ...  Recent studies aimed to solve this issue by leveraging deep learning (DL) models such as the convolutional neural network (CNN), which is inherently trained in an end-to-end fashion (36).  ...
arXiv:2106.03535v1 fatcat:jx7ixd7xjngthaq6qhb25gssm4
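
Most GNN models applied to connectomes build on a graph-convolution step of the form H' = sigma(D^-1/2 A_hat D^-1/2 H W). A minimal sketch on a toy connectome (region count and feature size are arbitrary):

    import torch
    import torch.nn as nn

    class GCNLayer(nn.Module):
        """One graph-convolution step: add self-loops, symmetrically normalize, aggregate, transform."""
        def __init__(self, in_dim: int, out_dim: int):
            super().__init__()
            self.weight = nn.Linear(in_dim, out_dim, bias=False)

        def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
            a_hat = adj + torch.eye(adj.size(0))           # add self-loops
            d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)        # D^-1/2
            norm_adj = d_inv_sqrt[:, None] * a_hat * d_inv_sqrt[None, :]
            return torch.relu(norm_adj @ self.weight(x))   # aggregate neighbors, then transform

    # 90 brain regions with 16 features each, plus a binary connectivity matrix.
    x, adj = torch.randn(90, 16), (torch.rand(90, 90) > 0.8).float()
    h = GCNLayer(16, 32)(x, adj)   # -> shape (90, 32)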

Weight Agnostic Neural Networks [article]

Adam Gaier, David Ha
2019 arXiv   pre-print
But how important are the weight parameters of a neural network compared to its architecture?  ...  Not all neural network architectures are created equal; some perform much better than others for certain tasks.  ...  deep network architectures.  ...
arXiv:1906.04358v2 fatcat:i4qtsknhnjab7p5i5nvmvkvd6y
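
The central evaluation trick in weight-agnostic search, scoring a fixed topology while every connection shares a single weight value, can be shown with a toy, single-layer stand-in (the paper evaluates full architectures on control and classification tasks):

    import numpy as np

    def evaluate_topology(adjacency: np.ndarray, inputs: np.ndarray, shared_w: float) -> np.ndarray:
        """Run a feedforward graph in which every connection uses the same shared weight."""
        return np.tanh(inputs @ (adjacency * shared_w))

    # Fixed random topology from 4 inputs to 3 outputs, probed across shared-weight values.
    rng = np.random.default_rng(0)
    topology = (rng.random((4, 3)) < 0.5).astype(float)
    x = rng.standard_normal((10, 4))
    outputs_per_weight = {w: evaluate_topology(topology, x, w) for w in (-2.0, -1.0, 1.0, 2.0)}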

A Convolutional Network Architecture Driven by Mouse Neuroanatomical Data [article]

Jianghong Shi, Michael A. Buice, Eric Shea-Brown, Stefan Mihalas, Bryan Tripp
2020 bioRxiv   pre-print
Convolutional neural networks trained on object recognition derive some inspiration from the neuroscience of the visual system in primates, and have been used as models of the feedforward computation performed  ...  Since mice are capable of visually guided behavior, this raises questions about the role of architecture in neural computation.  ...  We acknowledge the NIH Graduate training grant in neural computation and engineering (R90DA033461).  ... 
doi:10.1101/2020.10.23.353151 fatcat:oaesivjkkvdz7ewj73gb4gwkqy

A Review of Network Inference Techniques for Neural Activation Time Series [article]

George Panagopoulos
2018 arXiv   pre-print
Furthermore, we add a data mining methodology inspired by influence estimation in social networks as a new supervised learning approach.  ...  For comparison, we use the small version of the ChaLearn Connectomics competition, which is accompanied by ground-truth connections between neurons.  ...  The architecture of the residual neural network employed [6] can be seen in Figure 1 and consists, in detail, of:  ...
arXiv:1806.08212v1 fatcat:j64g5tlhd5af7hwyv45w4lz7qa
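
A common unsupervised baseline for this task is thresholded pairwise correlation of the activity traces; a minimal sketch, not tied to any specific method in the review:

    import numpy as np

    def correlation_network(activity: np.ndarray, threshold: float = 0.3) -> np.ndarray:
        """activity: (n_timepoints, n_neurons) -> binary adjacency from Pearson correlation."""
        corr = np.corrcoef(activity, rowvar=False)   # (n_neurons, n_neurons)
        np.fill_diagonal(corr, 0.0)                  # ignore self-connections
        return (np.abs(corr) > threshold).astype(int)

    # Toy activation traces for 50 neurons over 1000 time points.
    rng = np.random.default_rng(1)
    adjacency = correlation_network(rng.standard_normal((1000, 50)))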

Learning function from structure in neuromorphic networks [article]

Laura E Suarez, Blake A Richards, Guillaume Lajoie, Bratislav Misic
2020 bioRxiv   pre-print
Here we reconstruct human brain connectomes using in vivo diffusion-weighted imaging, and use reservoir computing to implement these connectomes as artificial neural networks.  ...  We then train these neuromorphic networks to learn a cognitive task. We show that biologically realistic neural architectures perform optimally when they display critical dynamics.  ...  By studying artificial neural networks with connectome-based architectures, we begin to reveal the functional consequences of brain network topology.  ... 
doi:10.1101/2020.11.10.350876 fatcat:titwdwrdh5el3g3vdmk4f3iogi
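
A minimal reservoir-computing sketch in the spirit of this study: a fixed recurrent weight matrix (random here; connectome-derived in the paper) acts as the reservoir, and only a linear readout is trained, via ridge regression:

    import numpy as np

    rng = np.random.default_rng(0)
    n_res, n_steps = 200, 500

    # Fixed reservoir weights; in the paper this matrix comes from diffusion-weighted connectomes.
    W = rng.standard_normal((n_res, n_res)) * 0.1
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # keep the spectral radius below 1
    W_in = rng.standard_normal((n_res, 1))

    u = np.sin(np.linspace(0, 20 * np.pi, n_steps))[:, None]   # input signal
    target = np.roll(u[:, 0], -5)                              # toy task: predict 5 steps ahead

    states, x = np.zeros((n_steps, n_res)), np.zeros(n_res)
    for t in range(n_steps):
        x = np.tanh(W @ x + W_in @ u[t])    # reservoir state update; the reservoir itself is never trained
        states[t] = x

    # Train only the linear readout with ridge regression.
    ridge = 1e-3
    W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res), states.T @ target)
    prediction = states @ W_out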

The physics of brain network structure, function, and control [article]

Christopher W. Lynn, Danielle S. Bassett
2019 arXiv   pre-print
We begin by considering the organizing principles of brain network architecture instantiated in structural wiring under constraints of symmetry, spatial embedding, and energy minimization.  ...  We next consider models of brain network function that stipulate how neural activity propagates along these structural connections, producing the long-range interactions and collective dynamics that support  ...  Sizemore for artistic inspiration. D.S.B. and C.W.L. acknowledge support from the John D. and Catherine T. MacArthur Foundation, the Alfred P.  ... 
arXiv:1809.06441v3 fatcat:5udhubilgvdfhefiwfwcwzmdy4

A Spiking Neural Network Emulating the Structure of the Oculomotor System Requires No Learning to Control a Biomimetic Robotic Head [article]

Praveenram Balachandar, Konstantinos P. Michmizos
2020 arXiv   pre-print
The controller is unique in the sense that (1) all data are encoded and processed by a spiking neural network (SNN), and (2) by mimicking the topology of the associated brain areas, the SNN is biologically interpretable  ...  Leveraging recent studies on the neural connectome associated with eye movements, we designed a neuromorphic oculomotor controller and placed it at the heart of our in-house biomimetic robotic head  ...  networks (SNN) that seamlessly integrate with non-von Neumann architectures.  ...
arXiv:2002.07534v2 fatcat:psvsshwxnzgfxnlo3wx7a4ijy4
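
For context, the basic unit of such spiking networks is the leaky integrate-and-fire neuron; a minimal simulation with illustrative parameters (not the controller's):

    import numpy as np

    def lif_simulate(input_current, dt=1e-3, tau=0.02, v_thresh=1.0, v_reset=0.0):
        """Leaky integrate-and-fire: the membrane potential leaks toward rest and spikes at threshold."""
        v, spikes, trace = 0.0, [], []
        for i_t in input_current:
            v += dt / tau * (-v + i_t)      # leaky integration of the input current
            if v >= v_thresh:               # threshold crossing emits a spike and resets
                spikes.append(1)
                v = v_reset
            else:
                spikes.append(0)
            trace.append(v)
        return np.array(spikes), np.array(trace)

    spikes, v_trace = lif_simulate(np.full(1000, 1.5))   # constant drive -> regular spiking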

Sparsity through evolutionary pruning prevents neuronal networks from overfitting

Richard C. Gerum, André Erpenbeck, Patrick Krauss, Achim Schilling
2020 Neural Networks  
This could be the case because the brain is not a randomly initialized neural network that has to be trained from scratch by simply investing a lot of computing power, but has from birth some fixed  ...  However, the networks still fail, in contrast to our brain, to develop general intelligence in the sense of being able to solve several complex tasks with only one network architecture.  ...  Acknowledgments This work was supported by the German Research Foundation (DFG, grant KR5148/2-1 to PK, project number: 436456810), and the Emergent Talents Initiative (ETI) of the University of Erlangen-Nuremberg  ...
doi:10.1016/j.neunet.2020.05.007 pmid:32454374 fatcat:hjx2sv7ejnfx3jfulafdyaqsf4
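
The paper's pruning is evolutionary, but the mechanics of enforcing sparsity in a layer can be illustrated with simple magnitude pruning, shown here as a stand-in rather than the authors' procedure:

    import torch
    import torch.nn as nn

    def magnitude_prune(layer: nn.Linear, sparsity: float) -> torch.Tensor:
        """Zero out the smallest-magnitude weights and return the binary mask that was applied."""
        with torch.no_grad():
            k = int(sparsity * layer.weight.numel())
            threshold = layer.weight.abs().flatten().kthvalue(k).values
            mask = (layer.weight.abs() > threshold).float()
            layer.weight.mul_(mask)
        return mask

    layer = nn.Linear(784, 100)
    mask = magnitude_prune(layer, sparsity=0.9)   # keep roughly the largest 10% of weights
    # During training, reapply the mask after each optimizer step to keep the layer sparse.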

GSR-Net: Graph Super-Resolution Network for Predicting High-Resolution from Low-Resolution Functional Brain Connectomes [article]

Megi Isallari, Islem Rekik
2020 arXiv   pre-print
Second, inspired by spectral theory, we break the symmetry of the U-Net architecture by topping it with a graph super-resolution (GSR) layer and two graph convolutional network layers to predict an HR  ...  Catchy but rigorous deep learning architectures have been tailored for image super-resolution (SR); however, these fail to generalize to non-Euclidean data such as brain connectomes.  ...  In recent years, advances in deep learning have inspired a multitude of works in image super-resolution, ranging from the early approaches using Convolutional Neural Networks (CNNs) (e.g.  ...
arXiv:2009.11080v1 fatcat:5l7usigbfncphmt25yfusii4cm

Design of Neuromorphic Cognitive Module based on Hierarchical Temporal Memory and Demonstrated on Anomaly Detection

Marek Otahal, Michal Najman, Olga Stepankova
2016 Procedia Computer Science  
Our presented idea is to integrate an artificial neural network (probably of BICA type) with a real biological network (ideally, in the future, with the human brain) in order to extend or enhance cognitive- and  ...  sensory capabilities (e.g., by associating existing and artificial sensory inputs).  ...  "Deep" neural networks: Deep Neural Network (DNN) architectures, which have gained enormous popularity in recent years, would be the system most similar to HTM, since both rely on hierarchies  ...
doi:10.1016/j.procs.2016.07.430 fatcat:3dtif4xxd5c7niwsgyjyi52cva
Showing results 1–15 of 635 results