Learning incremental syntactic structures with recursive neural networks
KES'2000. Fourth International Conference on Knowledge-Based Intelligent Engineering Systems and Allied Technologies. Proceedings (Cat. No.00TH8516)
Our proposal relies on a machine learning technique for predicting the correctness of partial syntactic structures that are built during the parsing process. ...
A recursive neural network architecture is employed for computing predictions after a training phase on examples drawn from a corpus of parsed sentences, the Penn Treebank. ...
The structured nature of instances in the learning domain makes this task suitable for the application of recursive neural networks (also known as the folding architecture), which can solve the supervised learning ...
doi:10.1109/kes.2000.884088
dblp:conf/kes/CostaFLS00
fatcat:4xhiuttmu5d3nlazszupdvkbcq
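A minimal sketch of the technique this entry describes, assuming a binary parse tree and illustrative names (W_left, W_right, w_out, dim are not from the paper): a recursive network composes node representations bottom-up, and a logistic output unit scores the correctness of the partial structure.

    import numpy as np

    rng = np.random.default_rng(0)
    dim = 16                                  # node representation size (assumed)
    W_left = rng.normal(0, 0.1, (dim, dim))   # composition weights (illustrative)
    W_right = rng.normal(0, 0.1, (dim, dim))
    w_out = rng.normal(0, 0.1, dim)           # logistic scoring unit

    def encode(node, leaf_vecs):
        # A node is either a leaf string or ('label', left, right).
        if isinstance(node, str):
            return leaf_vecs[node]
        _, left, right = node
        return np.tanh(W_left @ encode(left, leaf_vecs)
                       + W_right @ encode(right, leaf_vecs))

    def p_correct(tree, leaf_vecs):
        # Probability that the partial syntactic structure is correct.
        return 1.0 / (1.0 + np.exp(-w_out @ encode(tree, leaf_vecs)))

Training on Penn Treebank examples would fit these weights by backpropagation through structure; the sketch shows only the forward pass.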
Diagnostic Classifiers Revealing how Neural Networks Process Hierarchical Structure
2016
Neural Information Processing Systems
We also show that gated recurrent neural networks, which process the expressions incrementally, perform surprisingly well on this task: they learn to predict the outcome of the arithmetic expressions with ...
We find that recursive neural networks can implement a generalising solution, and we visualise the intermediate steps: projection, summation and squashing. ...
(... = 0.52 and 0.95 for recursive and incremental, respectively). ...
Conclusions: In this paper we studied how recursive and recurrent neural networks process hierarchical structure, using a simple arithmetic ...
dblp:conf/nips/VeldhoenHZ16
fatcat:flkpbf7f5fgcffg7ix2ndtjj64
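The snippet names three intermediate steps, projection, summation and squashing; below is a minimal sketch of that composition on arithmetic expressions (the dimensions, the linear readout, and folding the operator into the weights are all simplifying assumptions, not the paper's setup).

    import numpy as np

    rng = np.random.default_rng(1)
    d = 8
    W_l = rng.normal(0, 0.3, (d, d))
    W_r = rng.normal(0, 0.3, (d, d))
    embed = {str(n): rng.normal(0, 0.3, d) for n in range(10)}
    readout = rng.normal(0, 0.3, d)           # predicts the expression's value

    def compose(expr):
        if isinstance(expr, str):             # digit leaf
            return embed[expr]
        _, left, right = expr                 # operator handling simplified away
        p = W_l @ compose(left)               # projection of each child
        q = W_r @ compose(right)
        return np.tanh(p + q)                 # summation, then squashing

    value = readout @ compose(("+", "3", ("*", "4", "5")))  # untrained: arbitrary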
Wide Coverage Incremental Parsing by Learning Attachment Preferences
[chapter]
2001
Lecture Notes in Computer Science
A recursive neural network is trained on treebank data to learn first pass attachments, and is employed as a heuristic for guiding parsing decisions. ...
This paper presents a novel method for wide coverage parsing using an incremental strategy, which is psycholinguistically motivated. ...
Recursive neural networks can adaptively process labeled graphs, and exploit the supervised learning paradigm on structured data. ...
doi:10.1007/3-540-45411-x_30
fatcat:luhjw7uq75h3fdyssvltjt4p5u
Ambiguity Resolution Analysis in Incremental Parsing of Natural Language
2005
IEEE Transactions on Neural Networks
Index Terms: Recursive neural networks, structured data, first pass attachment, incremental parsing, learning preferences. ...
In earlier work we have introduced a recursive neural network capable of performing syntactic ambiguity resolution in incremental parsing. ...
We propose a recursive neural network (RNN) [18], [19] to learn this preference task. ...
doi:10.1109/tnn.2005.849837
pmid:16121736
fatcat:luutmjvusjcffeyrzdtmprgn5y
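A sketch of how such a preference network is used during incremental parsing: each new word yields several candidate attachments, the trained network scores each partial tree, and the parser pursues the best-scoring one first. The score_tree argument below stands in for the trained network's output; it is an assumed interface, not the paper's API.

    def rank_attachments(candidate_trees, score_tree):
        # Order candidate partial trees by learned preference, best first.
        return sorted(candidate_trees, key=score_tree, reverse=True)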
Tree-structured composition in neural networks without tree-structured architectures
[article]
2015
arXiv
pre-print
We hypothesize that neural sequence models like LSTMs are in fact able to discover and implicitly use recursive compositional structure, at least for tasks with clear cues to that structure in the data ...
Tree-structured neural networks encode a particular tree geometry for a sentence in the network design. However, these models have at best only slightly outperformed simpler sequence-based models. ...
... neural networks, which build representations incrementally according to the hierarchical structure of linguistic phrases [5, 6]. ...
arXiv:1506.04834v3
fatcat:ntcmi6hv2ff7rkrdojoopk42dq
Extended Cascade-Correlation for syntactic and structural pattern recognition
[chapter]
1996
Lecture Notes in Computer Science
In this paper, we suggest that neural networks, and specifically Cascade-Correlation, can be used for automatic inference in syntactic and structural pattern recognition as well. ...
An extended version of a standard neuron which is able to deal with structures is presented and the Cascade-Correlation algorithm generalized to structured domains. ...
The aim of this paper is to suggest that artificial neural networks can be applied as well to syntactic and structural pattern recognition. ...
doi:10.1007/3-540-61577-6_10
fatcat:myc6eesbfnfotjcbggtoe5tbca
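A minimal sketch of the extended neuron the abstract describes: besides the node's label, the unit receives its own outputs computed at the node's children, so a single unit can be unfolded over an arbitrary tree. Sizes and names are illustrative, and the Cascade-Correlation training loop itself is omitted.

    import numpy as np

    rng = np.random.default_rng(2)
    n_label, fan_out = 4, 2                   # label size, max children (assumed)
    w_label = rng.normal(0, 0.3, n_label)     # weights on the node label
    w_state = rng.normal(0, 0.3, fan_out)     # weights on recursive inputs

    def unit_output(node):
        # node = (label_vector, [child nodes]); returns this unit's activation.
        label, children = node
        rec = [unit_output(c) for c in children]
        rec += [0.0] * (fan_out - len(rec))   # pad missing children with zeros
        return np.tanh(w_label @ label + w_state @ np.asarray(rec))

    leaf = (np.ones(n_label), [])
    y = unit_output((np.ones(n_label), [leaf, leaf]))   # unfold over a small tree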
Learning first-pass structural attachment preferences with dynamic grammars and recursive neural networks
2003
Cognition
We use a dynamic grammar, which provides a very tight correspondence between grammatical derivations and incremental processing, and recursive neural networks, which are able to deal with the complex hierarchical ...
The model is a hybrid system, which uses symbolic grammars to build and represent syntactic structures, and neural networks to rank these structures on the basis of its experience. ...
In this paper, we will describe a learning method for this task based on recursive neural networks. ...
doi:10.1016/s0010-0277(03)00026-x
pmid:12763317
fatcat:jufsquemavftzm3v4vkywssvwy
On learning an interpreted language with recurrent models
[article]
2021
arXiv
pre-print
We construct simplified datasets reflecting core properties of natural language as modeled in formal syntax and semantics: recursive syntactic structure and compositionality. ...
We find LSTM and GRU networks to generalise to compositional interpretation well, but only in the most favorable learning settings, with a well-paced curriculum, extensive training data, and left-to-right ...
Recurrent neural networks are promising with respect to these desiderata; in a syntactic task (Lakretz et al. 2021), they demonstrated similar error patterns to humans in processing recursive structures ...
arXiv:1809.04128v3
fatcat:2m35zmhpejcfpabkpqf7kusraa
On Learning Interpreted Languages with Recurrent Models
2022
Computational Linguistics
We construct simplified datasets reflecting core properties of natural language as modeled in formal syntax and semantics: recursive syntactic structure and compositionality. ...
We find LSTM and GRU networks to generalise to compositional interpretation well, but only in the most favorable learning settings, with a well-paced curriculum, extensive training data, and left-to-right ...
Recurrent neural networks are promising with respect to these desiderata; in a syntactic task (Lakretz et al. 2021), they demonstrated similar error patterns to humans in processing recursive structures ...
doi:10.1162/coli_a_00431
fatcat:hnb2jvw7yfcxne2amq5iych4vm
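A sketch of the kind of dataset these two papers construct: recursively nested expressions paired with their compositional interpretations, here plain integer arithmetic as a stand-in for the papers' formal languages (the depth parameter and generation probabilities are assumptions for illustration). A recurrent model then reads the token sequence left to right and is trained to predict the interpretation.

    import random

    def gen(depth):
        # Return (expression string, its interpretation as an integer).
        if depth == 0 or random.random() < 0.3:
            n = random.randint(0, 9)
            return str(n), n
        (ls, lv), (rs, rv) = gen(depth - 1), gen(depth - 1)
        if random.random() < 0.5:
            return f"( {ls} + {rs} )", lv + rv
        return f"( {ls} - {rs} )", lv - rv

    random.seed(0)
    data = [gen(3) for _ in range(1000)]      # training pairs for an LSTM/GRU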
An Entity-Driven Recursive Neural Network Model for Chinese Discourse Coherence Modeling
2017
International Journal of Artificial Intelligence & Applications
... neural network framework. Evaluation results on both sentence ordering and machine translation coherence rating tasks show the effectiveness of the proposed model, which significantly outperforms the existing ...
... modeling remains a challenging task in the Natural Language Processing field. Existing approaches mostly rely on feature engineering, adopting sophisticated features to capture the logic or syntactic ...
Our future work is to integrate a coreference mechanism into the current combined recursive neural network model, together with other coherence evaluation tasks. ...
doi:10.5121/ijaia.2017.8201
fatcat:7uy2fh5tkvaznageq3it6gvr3u
Connectionist natural language parsing
2002
Trends in Cognitive Sciences
Connectionist parsers are assessed according to their ability to automatically learn from examples to represent syntactic structures, without being presented with symbolic grammar rules. ...
Teaser: The extent to which connectionist systems have succeeded in parsing a wide range of realistic sentences, containing syntactic structures that are commonly found in natural language interaction, ...
... consist entirely of one or more neural networks and that learn to process syntax entirely from examples. ...
doi:10.1016/s1364-6613(02)01980-0
pmid:12413578
fatcat:74j5hf54wvgd3ftvh4ei6yzjeu
Transition-Based Dependency Parsing with Stack Long Short-Term Memory
[article]
2015
arXiv
pre-print
Our primary innovation is a new control structure for sequence-to-sequence neural networks: the stack LSTM. ...
We propose a technique for learning representations of parser states in transition-based dependency parsers. ...
Acknowledgments The authors would like to thank Lingpeng Kong and Jacob Eisenstein for comments on an earlier version of this draft and Danqi Chen for assistance with the parsing datasets. ...
arXiv:1505.08075v1
fatcat:goicjmcmsfhnfg2ext2mfygi64
Transition-Based Dependency Parsing with Stack Long Short-Term Memory
2015
Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Our primary innovation is a new control structure for sequence-to-sequence neural networks: the stack LSTM. ...
We propose a technique for learning representations of parser states in transition-based dependency parsers. ...
Acknowledgments The authors would like to thank Lingpeng Kong and Jacob Eisenstein for comments on an earlier version of this draft and Danqi Chen for assistance with the parsing datasets. ...
doi:10.3115/v1/p15-1033
dblp:conf/acl/DyerBLMS15
fatcat:lq5zvojyanektfip2esl7gu7iu
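A minimal sketch of the stack LSTM control structure described in both versions above: an ordinary LSTM cell whose previous state is whatever sits on top of a stack, so a pop rewinds the summary to what it was before the popped element was pushed. It uses torch.nn.LSTMCell; sizes are illustrative.

    import torch

    class StackLSTM:
        def __init__(self, input_size=16, hidden_size=32):
            self.cell = torch.nn.LSTMCell(input_size, hidden_size)
            h0 = torch.zeros(1, hidden_size)
            self.stack = [(h0, h0.clone())]   # empty-stack state at the bottom

        def push(self, x):                    # x: (1, input_size) tensor
            self.stack.append(self.cell(x, self.stack[-1]))

        def pop(self):
            if len(self.stack) > 1:           # never pop the empty-stack state
                self.stack.pop()              # summary reverts automatically

        def summary(self):
            return self.stack[-1][0]          # hidden state of the current top

In the parser, three such stacks (the stack, the buffer, and the action history) summarise the parser state, and their summaries feed a classifier over transitions.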
A Survey of Syntactic-Semantic Parsing Based on Constituent and Dependency Structures
[article]
2020
arXiv
pre-print
Constituent parsing mainly targets syntactic analysis, while dependency parsing can handle both syntactic and semantic analysis. ...
This article briefly reviews the representative models of constituent parsing and dependency parsing, and also dependency graph parsing with rich semantics. ...
The recursive neural network is a natural method for modeling tree-structured outputs, composing a tree input incrementally, either bottom-up or top-down. ...
arXiv:2006.11056v1
fatcat:pd22rciuxzdc5kvghaapjjyg3u
Parsing with Compositional Vector Grammars
2013
Annual Meeting of the Association for Computational Linguistics
Instead, we introduce a Compositional Vector Grammar (CVG), which combines PCFGs with a syntactically untied recursive neural network that learns syntactico-semantic, compositional vector representations ...
Natural language parsing has typically been done with small sets of discrete categories such as NP and VP, but this representation captures neither the full syntactic nor the full semantic richness of linguistic ...
The compositional vectors are learned with a new syntactically untied recursive neural network. ...
dblp:conf/acl/SocherBMN13
fatcat:lzb6pgvbcrbhdehwx2pev565iu
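A minimal sketch of the syntactically untied composition the abstract mentions: instead of one global weight matrix, the matrix applied at each binary rule is selected by the children's syntactic categories, and the rule's score combines a learned vector product with the PCFG log-probability. The category inventory, sizes, and exact score combination are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(3)
    d = 10
    pairs = [("NP", "VP"), ("DT", "NN"), ("VBZ", "NP")]
    W = {p: rng.normal(0, 0.3, (d, 2 * d)) for p in pairs}   # untied weights
    s = {p: rng.normal(0, 0.3, d) for p in pairs}            # scoring vectors

    def compose(pair, h_left, h_right, log_p_pcfg):
        # Child categories select the weights; returns (vector, rule score).
        h = np.tanh(W[pair] @ np.concatenate([h_left, h_right]))
        return h, s[pair] @ h + log_p_pcfg

    h_np, score = compose(("DT", "NN"),
                          rng.normal(0, 0.3, d), rng.normal(0, 0.3, d), -1.2)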
Showing results 1–15 of 2,357 results