2,144 Hits in 3.8 sec

Exposition and Interpretation of the Topology of Neural Networks [article]

Rickard Brüel Gabrielsson, Gunnar Carlsson
2019 arXiv   pre-print
Convolutional neural networks (CNNs) are powerful and widely used tools. However, their interpretability is far from ideal.  ...  We use topological data analysis to show that the information encoded in the weights of a CNN can be organized in terms of a topological data model and demonstrate how such information can be interpreted  ...  and speed up the training of networks.  ...
arXiv:1810.03234v3 fatcat:dhmi62cxebfeffcgwbbowul5n4
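
A rough sense of how such a weight-space analysis can be set up: flatten, mean-center, and normalize first-layer convolutional filters to points on a sphere, then compute persistent homology of the resulting point cloud. This is a minimal sketch under stated assumptions, not the authors' exact pipeline; the ripser package and the random placeholder weights W are illustrative only.

import numpy as np
from ripser import ripser   # assumed available: the ripser.py package

def filter_point_cloud(W):
    """W: (n_filters, 3, 3) array of first-layer convolutional weights."""
    X = W.reshape(len(W), -1).astype(float)
    X -= X.mean(axis=1, keepdims=True)            # mean-center each filter
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    mask = norms[:, 0] > 1e-8                     # drop (near-)constant filters
    return X[mask] / norms[mask]                  # project the rest onto the unit sphere

W = np.random.randn(512, 3, 3)                    # placeholder for trained weights
dgms = ripser(filter_point_cloud(W), maxdim=1)["dgms"]
print("H0 intervals:", len(dgms[0]), "H1 intervals:", len(dgms[1]))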

Cognition: Differential-geometrical view on neural networks

S. A. Buffalov
1999 Discrete Dynamics in Nature and Society  
A neural network taken as a model of a trainable system appears to be nothing but a dynamical system evolving on a tangent bundle with changeable metrics.  ...  In other words, to learn means to change metrics of a definite manifold.  ...  The discretization of Eq. (1) provides two ways of displaying cognition in the framework of dynamical systems by means of interpretations held in terms of neural network theory: Mathematical caption  ...
doi:10.1155/s1026022699000060 fatcat:lcrr3o5ulrgq5fmjswajzgyuti

Two's company, three (or more) is a simplex: Algebraic-topological tools for understanding higher-order structure in neural data [article]

Chad Giusti and Robert Ghrist and Danielle S. Bassett
2016 arXiv   pre-print
While rarely mentioned, this fundamental assumption inherently limits the types of neural structure and function that graphs can be used to model.  ...  Based on the exceptional flexibility of the tools and recent ground-breaking insights into neural function, we posit that this framework has the potential to eclipse graph theory in unraveling the fundamental  ...  In the context of neural data, the presence of multiple homology classes indicates potentially interesting structure whose interpretation depends on the meaning of the vertices and simplices in the complex  ... 
arXiv:1601.01704v2 fatcat:2ammj4b6hnhxzm7znj3f4ncf6a
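
To make the proposed generalization concrete, the sketch below passes from a pairwise correlation graph to its clique (flag) complex, in which every (k+1)-clique becomes a k-simplex. The synthetic activity matrix, the threshold, and the use of networkx are illustrative assumptions, not taken from the paper.

import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
activity = rng.standard_normal((50, 1000))        # 50 "neurons", 1000 time bins
corr = np.corrcoef(activity)

G = nx.Graph()
G.add_nodes_from(range(50))
for i in range(50):
    for j in range(i + 1, 50):
        if corr[i, j] > 0.05:                     # arbitrary threshold
            G.add_edge(i, j)

# Every (k+1)-clique of the graph is a k-simplex of its flag complex.
counts = {}
for clique in nx.enumerate_all_cliques(G):
    k = len(clique) - 1
    counts[k] = counts.get(k, 0) + 1

euler = sum((-1) ** k * n for k, n in counts.items())
print("simplex counts by dimension:", counts)
print("Euler characteristic:", euler)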

Topology of deep neural networks [article]

Gregory Naitzat, Andrey Zhitnikov, Lek-Heng Lim
2020 arXiv   pre-print
We study how the topology of a data set M = M_a ∪ M_b ⊆ R^d, representing two classes a and b in a binary classification problem, changes as it passes through the layers of a well-trained neural network  ...  No matter how complicated the topology of M we begin with, when passed through a well-trained neural network f : R^d → R^p, there is a vast reduction in the Betti numbers of both components M_a and M_b;  ...  ), and the University of Chicago (Chicago-Vienna Faculty Grant and Eckhardt Faculty Fund).  ...
arXiv:2004.06093v1 fatcat:xybs7clxtvgizdpmkoqttizgj4
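
A hedged illustration of the shape of this experiment: track approximate Betti numbers of a simple data set, here a circle with Betti numbers (1, 1), as it passes through successive ReLU layers. The random untrained weights, layer widths, and the crude ripser-based persistence cutoff below are stand-ins; the paper studies well-trained networks with more careful estimators.

import numpy as np
from ripser import ripser   # assumed available

rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, 300)
X = np.c_[np.cos(theta), np.sin(theta)]           # a circle

def approx_betti(X, dim=1):
    dgms = ripser(X, maxdim=dim)["dgms"]
    # crude proxy: count intervals that persist longer than 0.5
    return [int(np.sum(d[:, 1] - d[:, 0] > 0.5)) for d in dgms]

print("input:", approx_betti(X))
for layer, width in enumerate([16, 16, 2], start=1):
    W = rng.standard_normal((X.shape[1], width))
    X = np.maximum(X @ W, 0.0)                    # random ReLU layer
    print(f"layer {layer}:", approx_betti(X))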

A Survey of Topological Machine Learning Methods

Felix Hensel, Michael Moor, Bastian Rieck
2021 Frontiers in Artificial Intelligence  
such as deep neural networks.  ...  The last decade saw an enormous boost in the field of computational topology: methods and concepts from algebraic and differential topology, formerly confined to the realm of pure mathematics, have demonstrated  ...  The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.  ... 
doi:10.3389/frai.2021.681108 pmid:34124648 pmcid:PMC8187791 fatcat:r3d6yjc5hbf53n3rqwxiv54pxq

Combining Dynamic Relaxation Method with Artificial Neural Networks to Enhance Simulation of Tensegrity Structures

Bernd Domer, Etienne Fest, Vikram Lalit, Ian F. C. Smith
2003 Journal of Structural Engineering  
First tests involving training the neural network online showed potential to adapt the model to changes during the service life of the structure.  ...  This paper describes the use of neural networks to improve the accuracy of the dynamic relaxation method in order to correspond more closely to data measured from a full-scale laboratory structure.  ...  The writers gratefully acknowledge the suggestions of Ashok Gupta (IIT Delhi), which led us to test whether the dynamic relaxation method can be completely replaced by neural networks.  ...
doi:10.1061/(asce)0733-9445(2003)129:5(672) fatcat:udedhl3hmbblpho2ccwutxguua
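
The general pattern, sketched under assumptions (synthetic data, a scikit-learn MLPRegressor standing in for the paper's network, and a linear surrogate in place of dynamic relaxation): learn the residual between simulated and measured responses and add it back to the simulation as a correction.

import numpy as np
from sklearn.neural_network import MLPRegressor   # assumed available

rng = np.random.default_rng(2)
loads = rng.uniform(0, 10, (200, 3))                          # hypothetical load cases
simulated = loads @ np.array([0.8, 1.1, 0.5])                 # cheap surrogate "simulation"
measured = simulated + 0.3 * np.sin(loads[:, 0]) + 0.05 * rng.standard_normal(200)

corrector = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
corrector.fit(loads, measured - simulated)                     # learn the model error

corrected = simulated + corrector.predict(loads)
print("rms error before:", np.sqrt(np.mean((measured - simulated) ** 2)))
print("rms error after: ", np.sqrt(np.mean((measured - corrected) ** 2)))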

Two's company, three (or more) is a simplex

Chad Giusti, Robert Ghrist, Danielle S. Bassett
2016 Journal of Computational Neuroscience  
While rarely mentioned, this fundamental assumption inherently limits the types of neural structure and function that graphs can be used to model.  ...  Here, we describe a generalization of graphs that overcomes these limitations, thereby offering a broad range of new possibilities in terms of modeling and measuring neural phenomena.  ...  Acknowledgments RG acknowledges support from the Air Force Office of Scientific Research (FA9550-12-1-0416 and FA9550-14-1-0012) and the Office of Naval Research (NO0014-16-1-2010).  ... 
doi:10.1007/s10827-016-0608-6 pmid:27287487 pmcid:PMC4927616 fatcat:eawtf6mwt5anvoizlc2ssi4zru

Signal Processing on Cell Complexes [article]

T. Mitchell Roddenberry, Michael T. Schaub, Mustafa Hajij
2022 arXiv   pre-print
These Hodge Laplacians enable the construction of convolutional filters, which can be employed in linear filtering and non-linear filtering via neural networks defined on cell complexes.  ...  The processing of signals supported on non-Euclidean domains has attracted large interest recently.  ...  Convolutional Neural Networks for k-chains.  ... 
arXiv:2110.05614v2 fatcat:hspfwejqcffsvhm45txxiqi634
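
The filtering construction mentioned in the abstract can be written down in a few lines. Below is a self-contained sketch for a single filled triangle: the Hodge 1-Laplacian is assembled from the boundary maps B1 and B2, and a polynomial filter in that Laplacian is applied to an edge signal. The particular complex, orientations, and filter taps are illustrative choices, not taken from the paper.

import numpy as np

# Boundary maps for one filled triangle on nodes {0, 1, 2} with oriented edges
# e0 = (0,1), e1 = (1,2), e2 = (0,2); the 2-cell is glued along e0 + e1 - e2.
B1 = np.array([[-1,  0, -1],
               [ 1, -1,  0],
               [ 0,  1,  1]], dtype=float)        # nodes x edges
B2 = np.array([[ 1],
               [ 1],
               [-1]], dtype=float)                # edges x 2-cells

L1 = B1.T @ B1 + B2 @ B2.T                        # Hodge 1-Laplacian (lower + upper part)

h = [1.0, -0.5, 0.1]                              # filter taps h_0, h_1, h_2 (arbitrary)
H = sum(hk * np.linalg.matrix_power(L1, k) for k, hk in enumerate(h))

x = np.array([1.0, 0.0, 2.0])                     # a signal supported on the three edges
print("filtered edge signal:", H @ x)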

Generative models for network neuroscience: prospects and promise

Richard F. Betzel, Danielle S. Bassett
2017 Journal of the Royal Society Interface  
Next, we review generative models of biological neural networks, both at the cellular and large-scale level, and across a variety of species including Caenorhabditis elegans, Drosophila, mouse, rat, cat  ...  Network neuroscience is the emerging discipline concerned with investigating the complex patterns of interconnections found in neural systems, and identifying principles with which to understand them.  ...  The authors thank Lia Papadopoulos and Evelyn Tang for helpful comments on earlier versions of this manuscript.  ... 
doi:10.1098/rsif.2017.0623 pmid:29187640 pmcid:PMC5721166 fatcat:b6hdzyjjfzhmddzcuwivbxg2we

Generative Models for Network Neuroscience: Prospects and Promise [article]

Richard F. Betzel, Danielle S. Bassett
2017 arXiv   pre-print
Next, we review generative models of biological neural networks, both at the cellular and large-scale level, and across a variety of species including C. elegans, Drosophila, mouse, rat, cat, macaque,  ...  Network neuroscience is the emerging discipline concerned with investigating the complex patterns of interconnections found in neural systems, and to identify principles with which to understand them.  ...  -1626008).The content is solely the responsibility of the authors and does not necessarily represent the official views of any of the funding agencies.  ... 
arXiv:1708.07958v1 fatcat:mgugkgztm5dw5glg22bsutrtwa
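
One concrete instance of the kind of generative model such reviews cover, sketched under assumptions (the exponential distance penalty, the parameters, and the uniform spatial embedding below are illustrative, not a model proposed in these papers): wire spatially embedded nodes with a probability that decays with wiring length.

import numpy as np

rng = np.random.default_rng(3)
n, eta = 100, 0.5                                  # node count and decay rate (assumed)
coords = rng.uniform(0, 10, (n, 3))                # random spatial embedding
D = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)

P = np.exp(-eta * D)                               # wiring probability decays with distance
A = (rng.uniform(size=(n, n)) < P).astype(int)
A = np.triu(A, 1)                                  # keep one direction, no self-loops
A = A + A.T                                        # symmetrize

print("edges:", A.sum() // 2, "mean wiring length:", (A * D).sum() / A.sum())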

Rolling in the Deep Convolutional Neural Networks

Derya Soydemir
2019 International Journal of Intelligent Systems and Applications in Engineering  
Over the past years, convolutional neural networks (CNNs) have achieved remarkable success in deep learning.  ...  However, the exposition of the theoretical calculations behind the convolution operation is rarely emphasized.  ...  Convolutional neural networks (CNNs) are a specialized kind of neural network for processing data that has a known grid-like topology [1].  ...
doi:10.18201/ijisae.2019457674 fatcat:6yp22kc67nct5gfm23g46v7yza
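
For readers who want the calculation spelled out, here is a short worked example of the operation CNN frameworks call convolution (in practice a cross-correlation with stride 1 and no padding); the 4x4 input and 2x2 kernel values are arbitrary.

import numpy as np

image = np.arange(16, dtype=float).reshape(4, 4)   # 4x4 input feature map
kernel = np.array([[1., 0.],
                   [0., -1.]])                     # 2x2 filter

out_h = image.shape[0] - kernel.shape[0] + 1       # 3
out_w = image.shape[1] - kernel.shape[1] + 1       # 3
out = np.zeros((out_h, out_w))
for i in range(out_h):
    for j in range(out_w):
        # multiply the filter element-wise with the patch it covers, then sum
        out[i, j] = np.sum(image[i:i + 2, j:j + 2] * kernel)

print(out)   # every entry is image[i, j] - image[i+1, j+1] = -5 for this kernel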

Spatial Embedding Imposes Constraints on the Network Architectures of Neural Systems [article]

Jennifer Stiso, Danielle Bassett
2018 arXiv   pre-print
Here, we review the rules imposed by space on the development of neural networks and show that these rules give rise to a specific set of complex topologies.  ...  Recent evidence demonstrates that the constraints imposed by the physical shape of the brain, and by the mechanical forces at play in its development, have marked effects on the observed network topology  ...  ACKNOWLEDGMENTS We would like to thank Ann Sizemore for her help with Box 2: Applied Algebraic Topology. D.S.B. and J.S. acknowledge support from the John D. and Catherine T.  ... 
arXiv:1807.04691v1 fatcat:b7a3rhllrzanxpbszsf7potjay

Neural Networks on Groups [article]

Stella Rose Biderman
2019 arXiv   pre-print
In this paper we extend the definition of neural networks to general topological groups and prove that neural networks with a single hidden layer and a bounded non-constant activation function can approximate  ...  Although neural networks are typically used to approximate functions defined over R^n, the successes of graph neural networks, point-cloud neural networks, and manifold deep learning among  ...  and Hanin [2017] and the work on polynomial neural networks of Yarotsky [2018].  ...
arXiv:1907.03742v2 fatcat:tgmnjpdcljhpxodvtntg755ada

Zeroth-Order Topological Insights into Iterative Magnitude Pruning [article]

Aishwarya Balwani, Jakob Krzyston
2022 arXiv   pre-print
Modern-day neural networks are famously large, yet also highly redundant and compressible; there exist numerous pruning strategies in the deep learning literature that yield over 90% sparser sub-networks  ...  In this work, we leverage the notion of persistent homology to gain insights into the workings of IMP and show that it inherently encourages retention of those weights which preserve topological information  ...  Acknowledgements The authors thank Georgia Tech's Graduate Student Association and College of Engineering for providing financial support to present this work, and the reviewers for their constructive  ... 
arXiv:2206.06563v2 fatcat:gai62lydhbchpifespxkzw5u7e
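
For context, the procedure the paper analyzes, iterative magnitude pruning (IMP), can be sketched in a few lines: repeatedly remove the smallest-magnitude surviving weights, retraining between rounds. The weight array below stands in for a trained layer, the retraining step is omitted, and the pruning fraction and round count are arbitrary.

import numpy as np

rng = np.random.default_rng(4)
weights = rng.standard_normal(10_000)              # stand-in for a trained layer's weights
mask = np.ones_like(weights, dtype=bool)

prune_frac, rounds = 0.2, 10                       # prune 20% of survivors per round
for r in range(rounds):
    # ... in a real IMP loop, retrain the masked network here ...
    surviving = np.abs(weights[mask])
    cutoff = np.quantile(surviving, prune_frac)    # magnitude threshold for this round
    mask &= np.abs(weights) > cutoff
    print(f"round {r + 1}: sparsity {1 - mask.mean():.2%}")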

Network architectures supporting learnability

Perry Zurn, Danielle S. Bassett
2020 Philosophical Transactions of the Royal Society of London. Biological Sciences  
Are the architectures of optimally learnable networks a topological reflection of the architectures of comparably developed neural networks?  ...  That is, learning is reliant on integrated network architectures at two levels: the epistemic and the computational, or the conceptual and the neural.  ...  We also acknowledge a probing conversation with Roger Myerson, which inspired some of this exposition.  ... 
doi:10.1098/rstb.2019.0323 pmid:32089113 pmcid:PMC7061954 fatcat:jc6t6digbjb7pokat4cedxbczi
Showing results 1–15 out of 2,144 results