In this paper we explore the node complexity of recursive neural network implementations of frontier-to-root tree automata (FRA). ... Specifically, we show that an FRAO (Mealy version) with m states, l input-output labels, and maximum rank N can be implemented by a recursive neural network with O((log l + log m) l m log l + N log m) units ... Recursive Neural Networks: The processing of trees by using neural networks can be understood by analogy with tree automata. ...doi:10.1109/72.809076 pmid:18252632 fatcat:ez2kw2zfmrcsrchf6ly6oglahy
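As a concrete illustration of the frontier-to-root (bottom-up) processing this entry refers to, here is a minimal sketch of an FRA run. The tree encoding, the function names, and the toy parity automaton are all illustrative assumptions, not taken from the paper:

```python
# Hedged sketch of a frontier-to-root (bottom-up) tree automaton.
# Trees are assumed to be (label, children) tuples; delta maps a node's
# label plus the tuple of its children's states to the node's state.

def run_fra(tree, delta):
    """Fold delta from the leaves (frontier) up to the root."""
    label, children = tree
    child_states = tuple(run_fra(c, delta) for c in children)
    return delta[(label, child_states)]

def accepts(tree, delta, accepting):
    return run_fra(tree, delta) in accepting

# Toy automaton tracking the parity of the number of leaves.
delta = {
    ('leaf', ()): 'odd',                 # a single leaf contributes 1
    ('node', ('odd', 'odd')): 'even',
    ('node', ('odd', 'even')): 'odd',
    ('node', ('even', 'odd')): 'odd',
    ('node', ('even', 'even')): 'even',
}
t2 = ('node', (('leaf', ()), ('leaf', ())))   # two leaves
print(accepts(t2, delta, {'even'}))           # True
```

The neural-network implementation the paper analyzes replaces the table lookup `delta[...]` with a unit-level encoding of states and labels; the node-complexity bound counts the units needed for that encoding.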
Lecture Notes in Computer Science
more importantly the discovery of new algorithms that can infer a variety of types of grammars and automata from heterogeneous data. ... Grammatical inference has historically found its first theoretical results in the field of inductive inference, but its first applications in the one of Syntactic and Structural Pattern Recognition. ... Acknowledgements The author would like to thank Laurent Miclet for helping him understand some of the different links between Grammatical Inference and Structural Pattern Recognition. ...doi:10.1007/3-540-44522-6_3 fatcat:hd42aiumwrhgtaa2rm3npqo2vy
We introduce a graphical formalism for representing this class of adaptive transductions by means of recursive networks, i.e., cyclic graphs where nodes are labeled by variables and edges are labeled by ... Structures are processed by unfolding the recursive network into an acyclic graph called encoding network. ... Tsoi for very fruitful discussions and comments on an earlier version of this paper. ...doi:10.1109/72.712151 pmid:18255765 fatcat:6bnfo4dlwbdznphuf42f2p4qzq
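The "unfolding" this entry describes, replicating one shared node function over an input structure to obtain an acyclic encoding network, can be sketched as follows. The function names and the toy size-encoding are illustrative assumptions, not the paper's formalism:

```python
# Hedged sketch of unfolding a recursive network over an input tree:
# the same node function f is instantiated at every tree node, yielding
# an acyclic computation (the "encoding network").

def unfold(tree, f, leaf_code):
    """Bottom-up: encode children first, then combine with the node label."""
    label, children = tree
    if not children:
        return leaf_code(label)
    return f(label, [unfold(c, f, leaf_code) for c in children])

# Toy instance: encode each subtree as its node count.
size = unfold(('a', [('b', []), ('c', [('d', [])])]),
              f=lambda label, kids: 1 + sum(kids),
              leaf_code=lambda label: 1)
print(size)  # 4
```

In the adaptive setting, `f` would be a trainable network shared across all instantiation sites, so gradients accumulate over every node of the unfolded graph.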
Lecture Notes in Computer Science
Inference and learning algorithms might be inherited from general algorithms for inference and learning in Bayesian networks. ... In fact, it is now widely recognized that Markovian models are just a special case of probabilistic belief networks  . ... Moreover, neural networks are known to be capable of robustly representing finite automata [16, 17] and interesting links have been established between symbolic and connectionist grammatical inference ...doi:10.1007/bfb0053996 fatcat:kam5jnaoz5dpxdf56gykf3cae4
We then provide a denotational semantics which uses a form of trace, augmented with information about enabledness, and is related to the failures model for CSP and to Hennessy's acceptance trees. ... By introducing closure conditions on trace sets, we achieve full abstraction: two processes have the same meaning if and only if they exhibit identical behaviors in all contexts. ... The types inferred are recursively constrained types, types that come with a system of constraints. ...doi:10.1016/0304-3975(96)80119-8 fatcat:ka6h7uxc3bbtpibrfe4nsa73p4
In this paper we analyze methodological and philosophical implications of algorithmic aspects of unconventional computation. ... In contrast, the open algorithmic universe, and even more the open world of algorithmic constellations, remove such restrictions and enable new, richer understanding of computation. ... Acknowledgements The authors would like to thank Andree Ehresmann, Hector Zenil and Marcin Schroeder for useful and constructive comments on the previous version of this work. ...doi:10.1007/978-3-642-37225-4_16 fatcat:yfy26okdojggxiruih6cpya34q
networks" an incremental tree is usually obtained by using a set of rules for connecting a possible parse tree to the previously obtained incremental tree. ... Parsing with finite automata networks implies, in one way, the conversion of a regular expression into a minimal deterministic finite automaton, while parsing with neural networks involves parsing of a ... Thereby, this twofold study will be able to discuss the inference of graph grammars in parsing with neural and finite automata networks. ...doi:10.5120/2878-3747 fatcat:ikw5hpj255acpmuucnd2giv2du
identification criteria for inductive inference of recursive real-valued functions (262-275); John Case, Sanjay Jain, Susanne Kaufmann, Arun Sharma and Frank Stephan, Predictive learning ... Verbeurgt, Learning sub-classes of monotone DNF on the uniform distribution (385-399); Ugis Sarkans and Janis Barzdin', Using attribute grammars for description of inductive inference search space (400 ...
One of the main features of this algorithm is the application of automata theory to formalize the problem of decision tree induction and the use of a hybrid approach, which integrates both syntactical and ... This paper introduces a new algorithm for the induction of decision trees, based on adaptive techniques. ... Hybrid trees are even more general, as they associate each leaf to an artificial neural network. This work concentrates on the still more popular discrete decision trees. ...doi:10.19153/cleiej.6.1.4 fatcat:2w2xuc4ednfwdcyfg3i3my4asy
Siegelmann, Recurrent neural networks (29-45); W. F. ... Pinter, Complexity of network training for classes of neural networks (215-227); Ricard Gavalda and David Guijarro, Learning ordered binary decision diagrams (228-238); Jorge Castro and José L. ...
325-338); Takeshi Shinohara, Inductive inference of monotonic formal systems from positive data (339-351); Shuling Liu and Masami Hagiya, Model inference of constrained recursive figures (355-367); Stephen ... programs by higher-order and semantic unification (396-410); Atsushi Togashi and Shoichi Noguchi, Inductive inference of term rewriting systems realizing algebras (411- ...
“An approach is presented in the particular area of inductive inference of recursive functions. Interactive scenarios of validating inductive inference algorithms are formalized. ... As an example to demonstrate the acquisition of descriptive adequacy by neural networks, we deal with the inference of well-formedness of an artificial language close to English. ...
Lecture Notes in Computer Science
Neural networks are artificial intelligence tools which already support automatic inference for successful applications of statistical pattern recognition. ... In this paper, we suggest that neural networks, and specifically Cascade-Correlation, can be used for automatic inference in syntactic and structural pattern recognition, as well. ... Acknowledgment We would like to thank Christoph Goller for the generation of the training and test sets used in this paper. ...doi:10.1007/3-540-61577-6_10 fatcat:myc6eesbfnfotjcbggtoe5tbca
, largest -, 479 inference from a walk, 3097 tree language, 3535 representation, 2031 theory, 825 tree languages, 1007, 1431, 2031, 2407, 2415, 2459, 3275 recognized by synchronized tree automata, ... languages, 477 flowchart, 1062 systolic language recognition, 453 by tree automata, 453 systolic networks, 730 design of -, 730 systolic ring, 730 schemes, 1062 system, 633 systems, 1062, 1333 ...doi:10.1016/s0304-3975(98)00319-3 fatcat:s22ud3iiqjht7lfbtc3zctk7zm
In contrast, recursive neural networks (recursive-NNs) are, theoretically, capable of achieving better extrapolation due to their tree-like design but are difficult to optimize as the depth of their underlying ... Artificial neural networks, including recurrent networks and transformers, struggle to generalize on these kinds of difficult compositional problems, often exhibiting poor extrapolation performance. ... A recursive neural network (recursive-NN) is a kind of tree-structured neural architecture in which each node is represented by an ANN (see Figure 1 ). ...arXiv:2104.02899v1 fatcat:2dk2hfwutjchvc7zjeucosxc7q
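The tree-structured design this entry describes, one shared network cell applied at every tree node, can be sketched as below. The weights, dimensions, and leaf vectors are made-up illustrations, not the paper's model:

```python
# Hedged sketch of a recursive-NN cell: every internal tree node is encoded
# by the SAME small network applied to its children's encodings, so the
# recursion depth equals the tree depth (the source of the optimization
# difficulty mentioned above).
import math

DIM = 2
W = [[0.5, -0.3, 0.2, 0.1],          # DIM x (2*DIM) shared weight matrix
     [0.1, 0.4, -0.2, 0.3]]
b = [0.0, 0.1]                       # shared bias

def cell(left, right):
    """h = tanh(W [left; right] + b), the shared composition function."""
    x = left + right                 # concatenate child encodings
    return [math.tanh(sum(W[i][j] * x[j] for j in range(2 * DIM)) + b[i])
            for i in range(DIM)]

def encode(tree, leaf_vecs):
    """Bottom-up encoding of a binary tree of leaf symbols."""
    if isinstance(tree, str):
        return leaf_vecs[tree]
    left, right = tree
    return cell(encode(left, leaf_vecs), encode(right, leaf_vecs))

leaves = {'x': [1.0, 0.0], 'y': [0.0, 1.0]}
root = encode((('x', 'y'), 'y'), leaves)   # DIM-dimensional root embedding
```

Because `cell` is shared across all nodes, gradients for deep trees pass through many applications of the same weights, which is the depth-dependent optimization issue the excerpt alludes to.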