4,672 Hits in 8.4 sec

Generalization by symbolic abstraction in cascaded recurrent networks

Mikael Bodén
2004 Neurocomputing  
Generalization performance in recurrent neural networks is enhanced by cascading several networks.  ...  By discretizing abstractions induced in one network, other networks can operate on a coarse symbolic level with increased performance on sparse and structural prediction tasks.  ... 
doi:10.1016/j.neucom.2004.01.006 fatcat:bjz2rbbty5gnxa4ma7n3annbsu
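The mechanism this abstract describes (continuous recurrent states discretized into a coarse symbolic code that a second network consumes) can be illustrated with a minimal numpy sketch. This is not Bodén's model: the sizes, random weights, and nearest-neighbour codebook below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_step(W, U, x, h):
    """One simple (Elman-style) recurrent step."""
    return np.tanh(W @ h + U @ x)

def discretize(h, codebook):
    """Snap a continuous hidden state to its nearest codebook symbol (one-hot)."""
    idx = np.argmin(np.linalg.norm(codebook - h, axis=1))
    sym = np.zeros(len(codebook))
    sym[idx] = 1.0
    return sym

# Illustrative sizes: 4-dim inputs, 8-dim hidden states, 6 discrete symbols.
n_in, n_hid, n_sym = 4, 8, 6
W1, U1 = rng.normal(size=(n_hid, n_hid)), rng.normal(size=(n_hid, n_in))
W2, U2 = rng.normal(size=(n_hid, n_hid)), rng.normal(size=(n_hid, n_sym))
codebook = rng.normal(size=(n_sym, n_hid))  # e.g. from clustering net-1 states

h1, h2 = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.normal(size=(10, n_in)):   # a toy input sequence
    h1 = rnn_step(W1, U1, x, h1)        # first network: continuous dynamics
    s = discretize(h1, codebook)        # abstraction: coarse symbolic code
    h2 = rnn_step(W2, U2, s, h2)        # second network sees only symbols
```

In the paper's setting the codebook would be induced from the first network's trained states rather than drawn at random as here.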

Page 1176 of Psychological Abstracts Vol. 92, Issue 4 [page]

2005 Psychological Abstracts  
Generalization performance in recurrent neural networks is enhanced by cascading several networks.  ...  (School of Information Technology and Electrical Engineering, University of Queensland, Australia) Generalization by symbolic abstraction in cascaded recurrent networks.  ... 

Neural Network Simulation of Infant Familiarization to Artificial Sentences: Rule-Like Behavior Without Explicit Rules and Variables

Thomas R. Shultz, Alan C. Bale
2001 Infancy  
A fundamental issue in cognitive science is whether human cognitive processing is better explained by symbolic rules or by subsymbolic neural networks.  ...  A recent study of infant familiarization to sentences in an artificial language seems to have produced data that can only be explained by symbolic rule learning and not by unstructured neural networks  ... 
doi:10.1207/s15327078in0204_07 pmid:33451192 fatcat:nciej4ujt5cnbp37z7pp6vshqm

On the Need for a Neural Abstract Machine [chapter]

Diego Sona, Alessandro Sperduti
2000 Lecture Notes in Computer Science  
Flowchart for the processing of sequences by a recurrent network. The feedforward encoding network, for each sequence, is generated by unfolding the recurrent network on the input sequence.  ... 
doi:10.1007/3-540-44565-x_7 fatcat:65bk3alsk5a2bhgdzdrsevkbhi
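The unfolding construction named in this caption can be made concrete with a short numpy sketch: for each input sequence, the recurrent step is replicated once per element with shared weights, producing a feedforward encoding network whose depth equals the sequence length. Sizes and weights here are illustrative.

```python
import numpy as np

def unfold_encode(W, U, sequence):
    """Unfold a recurrent step over a sequence: the same (W, U) weights are
    applied at every layer of the resulting feedforward encoding network."""
    h = np.zeros(W.shape[0])
    layers = []
    for x in sequence:
        h = np.tanh(W @ h + U @ x)  # one replicated layer per sequence element
        layers.append(h)
    return layers  # depth of the unfolded network == sequence length

rng = np.random.default_rng(1)
W, U = rng.normal(size=(5, 5)), rng.normal(size=(5, 3))
states = unfold_encode(W, U, rng.normal(size=(7, 3)))
print(len(states))  # 7 layers for a length-7 sequence
```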

Universal Approximation Capability of Cascade Correlation for Structures

Barbara Hammer, Alessio Micheli, Alessandro Sperduti
2005 Neural Computation  
Because of this fact, RNNs cannot, in general, be embedded into RCC networks: the two models differ with respect to the network architectures.  ...  Thus, the architecture is determined automatically and only small parts of the network are adapted during a training cycle. As already mentioned, RCC networks have been generalized to recursive cascade  ...  These models have been proposed in the literature based on recurrent or recursive neural networks in analogy to simple cascade correlation.  ... 
doi:10.1162/0899766053491878 fatcat:vahuuyjt4bhbthdsgxh2hxqbyi
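As a rough illustration of the cascade-correlation-style growth the snippet describes (architecture determined automatically, only small parts adapted per training cycle), here is a minimal Python sketch. The class and the training placeholder are hypothetical; real (recurrent) cascade correlation also trains each candidate unit to maximize correlation with the residual error before freezing it.

```python
import numpy as np

rng = np.random.default_rng(2)

class CascadeNet:
    """Sketch of cascade-correlation-style growth: hidden units are added one
    at a time; earlier units are frozen, so each training cycle adapts only
    the newest unit's incoming weights (plus output weights, omitted here)."""

    def __init__(self, n_in):
        self.n_in = n_in
        self.hidden = []          # list of frozen incoming-weight vectors

    def features(self, x):
        feats = list(x)
        for w in self.hidden:     # each unit sees inputs + all earlier units
            feats.append(np.tanh(np.dot(w, feats)))
        return np.array(feats)

    def add_unit(self):
        # The new unit's fan-in grows with every unit already installed.
        w = rng.normal(size=self.n_in + len(self.hidden))
        # Candidate training (maximizing correlation with the residual error)
        # would happen here; afterwards the weights are frozen:
        self.hidden.append(w)

net = CascadeNet(n_in=3)
for _ in range(4):
    net.add_unit()                          # architecture grows automatically
print(net.features(np.ones(3)).shape)       # 3 inputs + 4 hidden activations
```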

CMAP: Complement Map Database

Kun Yang, Ashok R. Dinasarapu, Edimara S. Reis, Robert A. DeAngelis, Daniel Ricklin, Shankar Subramaniam, John D. Lambris
2013 Computer applications in the biosciences : CABIOS  
The human complement system is increasingly perceived as an intricate protein network of effectors, inhibitors and regulators that drives critical processes in health and disease and extensively communicates  ...  network and discovering new connections.  ... 
doi:10.1093/bioinformatics/btt269 pmid:23661693 pmcid:PMC3702248 fatcat:geacibo4r5akdihgtf255sx7lm

Can connectionism save constructivism?

Gary F. Marcus
1998 Cognition  
I conclude by sketching a possible alternative.  ...  I then give a formal account of why the models fail to generalize in the ways that humans do. Thus, connectionism, at least in its current form, does not provide any support for constructivism.  ... 
doi:10.1016/s0010-0277(98)00018-3 pmid:9677762 fatcat:cls4jebkyrgk7iqrjqarvblnsi

Conscious Intelligence Requires Lifelong Autonomous Programming For General Purposes [article]

Juyang Weng
2020 arXiv   pre-print
Universal Turing Machines [29, 10, 18] are well known in computer science, but they are about manual programming for general purposes.  ...  APFGP) by machines, but also enable early-age conscious learning.  ...  In symbolic representations, it is a human who handcrafts every abstract concept as a symbol; but DN does not have a human in the "skull": it simply learns, processes, and generates vectors.  ... 
arXiv:2007.00001v1 fatcat:rplywrlfvfcubm4gxkafswkphq

When Coincidence has Meaning: Understanding Emergence Through Networks of Information Token Recurrence [article]

Markus Luczak-Roesch
2019 arXiv   pre-print
Building on the Transcendental Information Cascades approach, I outline a tensor theory of the interaction between rare micro-level events and macro-level system changes.  ...  In this paper I conceptualise a novel approach for capturing coincidences between events that do not necessarily have an observed causal relationship.  ...  A recurrence network is the interpretation of a recurrence plot of a time series with N discrete time steps as an undirected network of N nodes.  ... 
arXiv:1911.07642v1 fatcat:on4zwcwlavbwxg3e4tqzbxvqfm
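The recurrence-network definition quoted above translates almost directly into code: threshold the pairwise distance matrix of a time series to obtain a recurrence plot, then read that 0/1 matrix as the adjacency matrix of an undirected graph on N nodes. A minimal numpy sketch, assuming a scalar time series and a distance threshold eps:

```python
import numpy as np

def recurrence_network(series, eps):
    """Adjacency matrix of a recurrence network: time steps i and j are
    linked when their states recur, i.e. lie within eps of each other."""
    x = np.asarray(series, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])   # pairwise distances
    A = (dist <= eps).astype(int)            # recurrence plot as 0/1 matrix
    np.fill_diagonal(A, 0)                   # no self-loops: undirected graph
    return A

# Toy example: a sine wave revisits similar states once per period.
t = np.linspace(0, 4 * np.pi, 100)
A = recurrence_network(np.sin(t), eps=0.1)
print(A.shape, A.sum() // 2)                 # N x N matrix, number of edges
```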

A general framework for adaptive processing of data structures

P. Frasconi, M. Gori, A. Sperduti
1998 IEEE Transactions on Neural Networks  
The general framework proposed in this paper can be regarded as an extension of both recurrent neural networks and hidden Markov models to the case of acyclic graphs.  ...  A structured organization of information is typically required by symbolic processing.  ...  Another improvement in this direction has been the development of the cascade-correlation network for structure [48] , [19] (a generalization of recurrent cascade-correlation for sequences [50] ) which  ... 
doi:10.1109/72.712151 pmid:18255765 fatcat:6bnfo4dlwbdznphuf42f2p4qzq

Recurrent temporal networks and language acquisition—from corticostriatal neurophysiology to reservoir computing

Peter F. Dominey
2013 Frontiers in Psychology  
A second idea is that recurrent cortical networks with fixed connections can represent arbitrary sequential and temporal structure, which is the basis of the reservoir computing framework.  ...  While aspects of non-human primate and avian interaction clearly constitute communication, this communication appears distinct from the rich, combinatorial and abstract quality of human language.  ...  Serial and temporal structure were learned by the simpler temporal recurrent network (TRN), and the abstract structure was learned by the abstract recurrent network (ARN), which required a working memory  ... 
doi:10.3389/fpsyg.2013.00500 pmid:23935589 pmcid:PMC3733003 fatcat:2l6obfhavrbh5l3mvlgumfqm3m
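The reservoir computing idea mentioned here (a recurrent network with fixed connections, with learning confined to a readout) can be sketched as an echo-state-style model in a few lines of numpy. The spectral-radius scaling, sizes, and toy task are assumptions for illustration, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

n_in, n_res = 1, 50
W_in = rng.normal(scale=0.5, size=(n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # keep spectral radius below 1

def run_reservoir(inputs):
    """Drive the fixed-weight recurrent network; only the readout is learned."""
    h = np.zeros(n_res)
    states = []
    for u in inputs:
        h = np.tanh(W @ h + W_in @ u)       # fixed connections, as in the text
        states.append(h.copy())
    return np.array(states)

# Toy task: predict the next value of a sine wave from reservoir states.
u = np.sin(np.linspace(0, 8 * np.pi, 400))[:, None]
X = run_reservoir(u[:-1])
y = u[1:, 0]
W_out, *_ = np.linalg.lstsq(X, y, rcond=None)   # linear readout = all training
print(np.mean((X @ W_out - y) ** 2))            # small training error
```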

Rethinking Eliminative Connectionism

Gary F. Marcus
1998 Cognitive Psychology  
theories in favor of descriptions couched in terms of networks of interconnected nodes.  ...  One account of how they are generalized holds that humans possess mechanisms that manipulate symbols and variables; an alternative account holds that symbol-manipulation can be eliminated from scientific  ...  its ability to generalize abstract relationships.  ... 
doi:10.1006/cogp.1998.0694 pmid:9892549 fatcat:h3yovqxxkvcovfoahg37dmpuky

Sequence-to-Sequence Models Can Directly Translate Foreign Speech [article]

Ron J. Weiss, Jan Chorowski, Navdeep Jaitly, Yonghui Wu, Zhifeng Chen
2017 arXiv   pre-print
We present a recurrent encoder-decoder deep neural network architecture that directly translates speech in one language into text in another.  ...  In addition, we find that making use of the training data in both languages by multi-task training of sequence-to-sequence speech translation and recognition models with a shared encoder network can improve  ...  In fact, reading out transcriptions in the source language from this abstract representation requires a separate decoder network.  ... 
arXiv:1703.08581v2 fatcat:ykg67geai5bltcuydfsoa4c5km
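For orientation, a toy numpy sketch of the encoder-decoder shape the abstract describes: a recurrent encoder summarizes acoustic frames and a recurrent decoder emits target-language tokens. This omits attention, training, and the multi-task shared-encoder setup the authors actually use; all sizes and the greedy readout are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

n_feat, n_hid, n_vocab = 13, 32, 100
W_e, U_e = rng.normal(size=(n_hid, n_hid)), rng.normal(size=(n_hid, n_feat))
W_d, U_d = rng.normal(size=(n_hid, n_hid)), rng.normal(size=(n_hid, n_vocab))
V = rng.normal(size=(n_vocab, n_hid))

def encode(frames):
    """Recurrent encoder over acoustic frames (e.g. filterbank features)."""
    h = np.zeros(n_hid)
    for x in frames:
        h = np.tanh(W_e @ h + U_e @ x)
    return h  # fixed-size summary of the utterance

def decode(h, steps):
    """Recurrent decoder emitting target-language token ids, greedily."""
    s, tok, out = h, np.zeros(n_vocab), []
    for _ in range(steps):
        s = np.tanh(W_d @ s + U_d @ tok)
        idx = int(np.argmax(V @ s))
        tok = np.eye(n_vocab)[idx]
        out.append(idx)
    return out

frames = rng.normal(size=(50, n_feat))     # a toy 50-frame utterance
print(decode(encode(frames), steps=8))     # 8 translated-text token ids
```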

Unsupervised feature learning for optical character recognition

Devendra K Sahu, C. V. Jawahar
2015 2015 13th International Conference on Document Analysis and Recognition (ICDAR)  
In recent years, whole word recognition based on Recurrent Neural Networks (RNN) has gained popularity. These methods use simple features such as raw pixel values or profiles.  ...  In addition, these novel features also result in better convergence rate of the RNNs.  ...  LSTM networks are also easily trainable, unlike general recurrent neural networks.  ... 
doi:10.1109/icdar.2015.7333920 dblp:conf/icdar/SahuJ15 fatcat:353hqn3x6nhuhjgiyc6qq7bb4y

An End-to-end Framework for Audio-to-Score Music Transcription on Monophonic Excerpts

Miguel A. Román, Antonio Pertusa, Jorge Calvo-Zaragoza
2018 Zenodo  
The proposed method is based on a Convolutional Recurrent Neural Network architecture directly trained with pairs of spectrograms and their corresponding symbolic scores in Western notation.  ...  In this work, we present an end-to-end framework for audio-to-score transcription.  ...  of the RNN, which leads to a Convolutional Recurrent Neural Network (CRNN).  ... 
doi:10.5281/zenodo.1492337 fatcat:flzhe2sjmbdkhnvvxnthpd34ya
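A minimal numpy sketch of the CRNN forward pass the abstract describes: convolutional filters slide over a spectrogram, and a recurrent layer turns the resulting feature frames into per-frame scores over symbolic tokens. Shapes, weights, and the toy convolution are assumptions, not the authors' architecture.

```python
import numpy as np

rng = np.random.default_rng(4)

T, F, n_k, k_t, n_hid, n_sym = 64, 40, 8, 5, 16, 30

def conv1d_valid(x, kernels):
    """Slide each (k_t, F) kernel along the time axis of a (T, F) spectrogram."""
    out_T = x.shape[0] - kernels.shape[1] + 1
    return np.array([[np.sum(x[t:t + kernels.shape[1]] * k) for k in kernels]
                     for t in range(out_T)])

spectrogram = rng.normal(size=(T, F))
kernels = rng.normal(size=(n_k, k_t, F))
feats = np.maximum(conv1d_valid(spectrogram, kernels), 0)   # conv + ReLU

W, U, V = (rng.normal(size=(n_hid, n_hid)),
           rng.normal(size=(n_hid, n_k)),
           rng.normal(size=(n_sym, n_hid)))
h = np.zeros(n_hid)
logits = []
for f in feats:                      # recurrent stage over the conv features
    h = np.tanh(W @ h + U @ f)
    logits.append(V @ h)             # per-frame scores over symbolic tokens
print(np.array(logits).shape)        # (T - k_t + 1, n_sym)
```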
Showing results 1–15 out of 4,672 results