
On the Realization of Compositionality in Neural Networks

Joris Baan, Jana Leible, Mitja Nikolaus, David Rau, Dennis Ulmer, Tim Baumgärtner, Dieuwke Hupkes, Elia Bruni
2019 Proceedings of the 2019 ACL Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP   unpublished
We then do an in-depth analysis of the structural differences between the two model types, focusing in particular on the organisation of the parameter space and the hidden layer activations and find noticeable  ...  Acknowledgements DH is funded by the Netherlands Organization for Scientific Research (NWO), through a Gravitation Grant 024.001.006 to the Language in Interaction Consortium.  ...
doi:10.18653/v1/w19-4814 fatcat:be7tdxikyvevfgdv3wq3yggqeu

Convolutional Neural Networks with Recurrent Neural Filters

Yi Yang
2018 Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing  
We introduce a class of convolutional neural networks (CNNs) that utilize recurrent neural networks (RNNs) as convolution filters.  ...  We show that simple CNN architectures equipped with recurrent neural filters (RNFs) achieve results that are on par with the best published ones on the Stanford Sentiment Treebank and two answer sentence  ...  We thank Kazi Shefaet Rahman, Ozan Irsoy, Chen-Tse Tsai, and Lingjia Deng for their valuable comments on earlier versions of this paper. We also thank the EMNLP reviewers for their helpful feedback.  ... 
doi:10.18653/v1/d18-1109 dblp:conf/emnlp/Yang18 fatcat:r6cbi5bmzncinmkppqrz5qt5xy
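
A note on the architecture described above: the filter of a standard CNN is a linear map over an n-gram window, and the RNF idea replaces it with a small RNN whose final hidden state serves as the filter response. The Python sketch below illustrates this with a plain tanh RNN cell, random weights, and max-over-time pooling; the cell type, dimensions, and pooling choice are illustrative assumptions, not necessarily the paper's configuration.

    import numpy as np

    def rnn_step(x, h, Wx, Wh, b):
        # one vanilla-RNN step; a gated cell could be substituted, this keeps the sketch short
        return np.tanh(x @ Wx + h @ Wh + b)

    def recurrent_neural_filter(embeddings, window=3, hidden=8, seed=0):
        """Run an RNN over every n-gram window; its final state is the filter response."""
        rng = np.random.default_rng(seed)
        d = embeddings.shape[1]
        Wx = rng.normal(scale=0.1, size=(d, hidden))
        Wh = rng.normal(scale=0.1, size=(hidden, hidden))
        b = np.zeros(hidden)
        feats = []
        for start in range(len(embeddings) - window + 1):
            h = np.zeros(hidden)
            for x in embeddings[start:start + window]:
                h = rnn_step(x, h, Wx, Wh, b)
            feats.append(h)
        return np.max(np.stack(feats), axis=0)   # max-over-time pooling, as in standard CNNs

    # toy usage: a 5-token sentence with 4-dimensional embeddings
    sentence = np.random.default_rng(1).normal(size=(5, 4))
    print(recurrent_neural_filter(sentence).shape)   # (8,)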

Convolutional Neural Networks with Recurrent Neural Filters [article]

Yi Yang
2018 arXiv   pre-print
We introduce a class of convolutional neural networks (CNNs) that utilize recurrent neural networks (RNNs) as convolution filters.  ...  We show that simple CNN architectures equipped with recurrent neural filters (RNFs) achieve results that are on par with the best published ones on the Stanford Sentiment Treebank and two answer sentence  ...  We thank Kazi Shefaet Rahman, Ozan Irsoy, Chen-Tse Tsai, and Lingjia Deng for their valuable comments on earlier versions of this paper. We also thank the EMNLP reviewers for their helpful feedback.  ... 
arXiv:1808.09315v1 fatcat:dir4qt6v5rdmbpkjyrmojrebnq

Linguistic generalization and compositionality in modern artificial neural networks [article]

Marco Baroni
2019 arXiv   pre-print
In the last decade, deep artificial neural networks have achieved astounding performance in many natural language processing tasks.  ...  After reviewing the main innovations characterizing current deep language processing networks, I discuss a set of studies suggesting that deep networks are capable of subtle grammar-dependent generalizations  ...  Indeed, compositionality is conjectured to be a landmark not only of language but of human thought in general [15, 5, 16], and the compositional abilities of neural networks have been tested on tasks  ...
arXiv:1904.00157v2 fatcat:7ja6wz474vfmlllijyz5haby6e

Radically Compositional Cognitive Concepts [article]

Toby B. St Clere Smithe
2019 arXiv   pre-print
We therefore propose a radically compositional approach to computational neuroscience, drawing on the methods of applied category theory.  ...  We describe how these tools grant us a means to overcome complexity and improve interpretability, and supply a rigorous common language for scientific modelling, analogous to the type theories of computer  ...  This is immediate from the associativity of the sum in (1) and compositionality of output-input wiring in DDS.  ... 
arXiv:1911.06602v1 fatcat:fcukpo4gbraedjltg54bubp6xq
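
The snippet above appeals to the compositionality of output-input wiring between dynamical systems (the "DDS" of the excerpt). The toy Python sketch below only illustrates that general idea, that wiring one state machine's output into another's input yields another state machine of the same kind; the DDS class and the examples are invented and do not reproduce the paper's categorical formalism.

    from dataclasses import dataclass
    from typing import Any, Callable, Tuple

    @dataclass
    class DDS:
        state: Any
        step: Callable[[Any, Any], Tuple[Any, Any]]   # (state, input) -> (new state, output)

        def run(self, inputs):
            outputs = []
            for x in inputs:
                self.state, y = self.step(self.state, x)
                outputs.append(y)
            return outputs

    def wire(a: DDS, b: DDS) -> DDS:
        """Serial composition: a's output becomes b's input; the result is again a DDS."""
        def step(state, x):
            sa, sb = state
            sa, mid = a.step(sa, x)
            sb, out = b.step(sb, mid)
            return (sa, sb), out
        return DDS(state=(a.state, b.state), step=step)

    accumulator = DDS(0, lambda s, x: (s + x, s + x))    # running sum
    doubler     = DDS(None, lambda s, x: (s, 2 * x))     # stateless doubling
    print(wire(accumulator, doubler).run([1, 2, 3]))     # [2, 6, 12]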

Neurocompositional computing: From the Central Paradox of Cognition to a new generation of AI systems [article]

Paul Smolensky, R. Thomas McCoy, Roland Fernandez, Matthew Goldrick, Jianfeng Gao
2022 arXiv   pre-print
The widely accepted narrative attributes this progress to massive increases in the quantity of computational and data resources available to support statistical learning in deep artificial neural networks  ...  These have seemed irreconcilable until the recent mathematical discovery that compositionality can be realized not only through discrete methods of symbolic computing, but also through novel forms of continuous  ...  Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation or Microsoft.  ... 
arXiv:2205.01128v1 fatcat:b5z7omfvcba7zgmiu5b64db2qy
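
One concrete, widely cited example of realizing compositional structure with continuous computation is tensor product binding of fillers to roles. The numpy sketch below shows that encoding with made-up vectors; it is offered only as background for the abstract's claim and is not drawn from the paper itself.

    import numpy as np

    def bind(fillers, roles):
        """Superpose filler (x) role outer products into one structure tensor."""
        return sum(np.outer(f, r) for f, r in zip(fillers, roles))

    def unbind(structure, role):
        """Recover the filler bound to `role` (exact when the roles are orthonormal)."""
        return structure @ role

    roles = np.eye(2)                                     # orthonormal "subject" and "object" roles
    john = np.array([1.0, 0.0, 1.0])
    mary = np.array([0.0, 1.0, 1.0])

    s = bind([john, mary], roles)                         # encodes subject=john, object=mary
    print(np.allclose(unbind(s, roles[0]), john))         # True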

A reafferent model of song syntax generation in the Bengalese finch

Alexander Hanuschkin, Markus Diesmann, Abigail Morrison
2010 BMC Neuroscience  
Here, we present a functional network model of the song syntax generation based on realistic spiking neurons.  ...  Hence, our reafferent model demonstrates how compositionality of a system can be realized given neurobiologically realistic assumptions.  ...  of MEXT (Japan).  ... 
doi:10.1186/1471-2202-11-s1-p33 pmcid:PMC3090919 fatcat:nhd2utp3bzev7fbbzqgbo5jhda
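
For readers unfamiliar with the domain: Bengalese finch song syntax is commonly described as probabilistic transitions between syllables. The toy Markov-chain sketch below only illustrates what "song syntax generation" means at a functional level; the syllable labels and probabilities are invented, and the sketch contains none of the paper's spiking-neuron or reafferent machinery.

    import numpy as np

    TRANSITIONS = {
        "a": [("b", 0.7), ("c", 0.3)],
        "b": [("c", 1.0)],
        "c": [("a", 0.5), ("END", 0.5)],
    }

    def sing(start="a", seed=0):
        rng = np.random.default_rng(seed)
        song, syllable = [start], start
        while syllable != "END":
            nxt, probs = zip(*TRANSITIONS[syllable])
            syllable = rng.choice(nxt, p=probs)
            if syllable != "END":
                song.append(syllable)
        return "".join(song)

    print(sing())   # e.g. "abc" or a longer branching sequence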

Towards Grounding Compositional Concept Structures in Self-organizing Neural Encodings [chapter]

Martin V. Butz, Daniel Zöllner
2021 Proceedings of the International Conference "Sensory Motor Concepts in Language & Cognition"  
Compositionality, on the other hand, determines how concepts are associated with each other in a semantically meaningful and highly flexible manner.  ...  While the symbol grounding problem of agreeing on a mapping between symbols and sensory or even sensorimotor grounded concepts has been solved to a large extent, one possibly even deeper open problem remains  ...  An Experiment with a Simulated Robot Platform In a neural network simulation setup, it was shown that a second-order neural network with parametric bias neurons (sNNPB) is able to develop generalized behavioral  ...
doi:10.1515/9783110720303-012 fatcat:pan67xploba4lord2dwoqfhley

A Compositionality Machine Realized by a Hierarchic Architecture of Synfire Chains

Sven Schrader, Markus Diesmann, Abigail Morrison
2011 Frontiers in Computational Neuroscience  
Moreover, when the lower layer of the network is constructed in a closed-loop fashion, drawing strokes are generated sequentially.  ...  Finally, we investigate the spiking activity of our model to propose candidate signatures of synfire chain computation in measurements of neural activity during action execution.  ...  The question of how such compositional systems are realized in the neural substrate is an unresolved issue.  ... 
doi:10.3389/fncom.2010.00154 pmid:21258641 pmcid:PMC3020397 fatcat:kkeutzodg5hqtllgxetb6fwhka
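
A synfire chain is a feedforward series of neuron pools in which a packet of synchronous spikes is handed from one pool to the next. The toy simulation below illustrates only that basic propagation mechanism, with invented pool sizes, connectivity, and thresholds; it is not the hierarchic, closed-loop model of the paper.

    import numpy as np

    def run_synfire_chain(n_pools=10, pool_size=100, p_connect=0.6,
                          threshold=30, steps=12, seed=0):
        rng = np.random.default_rng(seed)
        # activity[t, i] = number of neurons in pool i spiking at time step t
        activity = np.zeros((steps, n_pools), dtype=int)
        activity[0, 0] = pool_size                    # stimulate the first pool
        for t in range(1, steps):
            for i in range(1, n_pools):
                # each neuron in pool i samples input from ~p_connect of pool i-1's spikes
                drive = rng.binomial(activity[t - 1, i - 1], p_connect, size=pool_size)
                activity[t, i] = int(np.sum(drive >= threshold))
        return activity

    print(run_synfire_chain())   # a wave of activity travels one pool per time step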

Symbolic, Distributed, and Distributional Representations for Natural Language Processing in the Era of Deep Learning: A Survey

Lorenzo Ferrone, Fabio Massimo Zanzotto
2020 Frontiers in Robotics and AI  
This is the right time to revitalize the area of interpreting how discrete symbols are represented inside neural networks.  ...  A clearer understanding of the strict link between distributed/distributional representations and symbols may certainly lead to radically new deep learning networks.  ...  A recursive neural network is then a basic block that is recursively applied over trees like the one in Figure 3.  ...
doi:10.3389/frobt.2019.00153 pmid:33501168 pmcid:PMC7805717 fatcat:353mgx2tr5ftxcx2utx776isou
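
The last excerpt refers to recursive neural networks applied over trees. The minimal numpy sketch below shows that composition scheme, one shared function applied bottom-up over a parse tree, with an invented lexicon and randomly initialized weights; it is a generic illustration, not code from the survey.

    import numpy as np

    rng = np.random.default_rng(0)
    DIM = 4
    W = rng.normal(scale=0.1, size=(2 * DIM, DIM))   # shared composition weights
    b = np.zeros(DIM)

    def compose(left, right):
        """Merge two child representations into their parent's representation."""
        return np.tanh(np.concatenate([left, right]) @ W + b)

    def encode(tree, lexicon):
        """tree is either a word (leaf) or a (left, right) pair of subtrees."""
        if isinstance(tree, str):
            return lexicon[tree]
        left, right = tree
        return compose(encode(left, lexicon), encode(right, lexicon))

    lexicon = {w: rng.normal(size=DIM) for w in ["the", "cat", "sleeps"]}
    root = encode((("the", "cat"), "sleeps"), lexicon)   # ((the cat) sleeps)
    print(root.shape)   # (4,)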

Predicate learning in neural systems: Discovering latent generative structures [article]

Andrea E. Martin, Leonidas A. A. Doumas
2018 arXiv   pre-print
During the process of predicate learning, an artificial neural network exploits the naturally occurring dynamic properties of distributed computing across neuronal assemblies in order to learn predicates  ...  We describe how predicates can be combined generatively using neural oscillations to achieve human-like extrapolation and compositionality in an artificial neural network.  ...  Predicate learning in an artificial neural network relies on exploiting the naturally occurring "neural" oscillations of distributed computation over time.  ... 
arXiv:1810.01127v1 fatcat:3ueslp3wqjcdjb4gajwptgbayy
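
The abstract invokes neural oscillations as the vehicle for combining predicates. The toy sketch below illustrates only the broad family of time-based binding schemes (role and filler units that fire in the same phase slot are read out as bound); the unit names and firing schedule are invented, and this is not the authors' model.

    import numpy as np

    roles    = ["lover", "beloved"]
    fillers  = ["Jim", "Mary"]
    units    = roles + fillers
    bindings = [("lover", "Jim"), ("beloved", "Mary")]

    # firing[u, t] = 1 if unit u fires in phase slot t of one oscillation cycle
    firing = np.zeros((len(units), len(bindings)), dtype=int)
    for slot, (r, f) in enumerate(bindings):
        firing[units.index(r), slot] = 1
        firing[units.index(f), slot] = 1

    # readout: a role and a filler count as bound iff they co-fire in some slot
    recovered = [(r, f) for r in roles for f in fillers
                 if np.any(firing[units.index(r)] & firing[units.index(f)])]
    print(recovered)   # [('lover', 'Jim'), ('beloved', 'Mary')]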

Conceptual Spaces for Cognitive Architectures: A lingua franca for different levels of representation

Antonio Lieto, Antonio Chella, Marcello Frixione
2017 Biologically Inspired Cognitive Architectures  
During the last decades, many cognitive architectures (CAs) have been realized adopting different assumptions about the organization and the representation of their knowledge level.  ...  of the Knowledge Level in Cognitive Systems and Architectures.  ...  Acknowledgements We thank Salvatore Gaglio and Peter Gärdenfors for the discussions on the topics presented in this article.  ... 
doi:10.1016/j.bica.2016.10.005 fatcat:rrtl457g7zccxnqztrvbk4wtnu

Modelling meaning composition from formalism to mechanism

Andrea E. Martin, Giosuè Baggio
2019 Philosophical Transactions of the Royal Society of London. Biological Sciences  
in the system: parts and wholes exist simultaneously yet independently from one another in the mind and brain.  ...  The debate on whether and how compositional systems could be implemented in minds, brains and machines remains vigorous.  ...  Baroni [48] considers the performance of current deep neural networks on tasks involving generalization.  ... 
doi:10.1098/rstb.2019.0298 pmid:31840588 fatcat:dxhm6xxp3be53eelfwqqhhf674

Integrative synchronization mechanisms in connectionist cognitive neuroarchitectures

Harald Maurer
2016 Computational Cognitive Science  
in the mind, and their continuous, numerical implementation in self-organizing neural networks modelling the neural information processing in the human brain.  ...  Based on the mathematics of nonlinear Dynamical System Theory, neurocognition can be analyzed by convergent fluid and transient neurodynamics in abstract n-dimensional system phase spaces in the form of  ...  implementation in neural networks in the brain.  ... 
doi:10.1186/s40469-016-0010-8 fatcat:qc2vghrrhveyfc5xf52kxfm4hu

Page 2239 of Psychological Abstracts Vol. 92, Issue 6 [page]

2005 Psychological Abstracts  
In this paper, we test whether it is possible to evolve a recurrent neural network controller to match the dynamic requirement of the task.  ...  The manner in which this mechanism of compositionality, based on dynamical systems, differs from that considered in conventional linguistics and other synthetic computational models, is discussed in this  ...
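
The first excerpt on this page concerns evolving a recurrent neural network controller. The sketch below shows a bare-bones version of that general approach, a (1+1) evolution strategy over the weights of a tiny recurrent network on an invented sine-tracking task; none of the task, architecture, or hyperparameters come from the abstract.

    import numpy as np

    rng = np.random.default_rng(0)
    H = 5   # hidden units; one input, one output

    def unpack(w):
        Wx = w[:H].reshape(1, H)
        Wh = w[H:H + H * H].reshape(H, H)
        Wo = w[H + H * H:].reshape(H, 1)
        return Wx, Wh, Wo

    def fitness(w, steps=50):
        """Negative squared error of the RNN's one-step-ahead sine prediction."""
        Wx, Wh, Wo = unpack(w)
        h, err = np.zeros((1, H)), 0.0
        for t in range(steps):
            h = np.tanh(np.array([[np.sin(0.3 * t)]]) @ Wx + h @ Wh)
            err += ((h @ Wo)[0, 0] - np.sin(0.3 * (t + 1))) ** 2
        return -err

    n_params = H + H * H + H
    best = rng.normal(scale=0.3, size=n_params)
    best_fit = fitness(best)
    for _ in range(200):                      # (1+1)-ES: keep a mutant only if it improves
        cand = best + rng.normal(scale=0.1, size=n_params)
        cand_fit = fitness(cand)
        if cand_fit > best_fit:
            best, best_fit = cand, cand_fit
    print(round(-best_fit, 3))                # remaining tracking error after evolution
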
Showing results 1 — 15 out of 1,646 results