Păun, Gheorghe, Grzegorz Rozenberg, and Arto Salomaa. 1994. Marcus contextual grammars: Modularity and leftmost derivation. ... Linguistics and Philosophy, 8:333-343. Sleator, Daniel and Davy Temperley. 1991. Parsing English with a link grammar. ...
Lecture Notes in Computer Science
in the derived CFG (Rayner et al., 2000b). ... Takezawa et al. (1991) categorize linguistic constraints into syntactic, semantic, pragmatic and contextual constraints. ... Menzel (1995) suggested a unified approach by using the constraint grammar formalism to express syntactic, semantic and pragmatic linguistic constraints. ...doi:10.1007/978-3-540-24840-8_61 fatcat:wfzhzfowxjgb3e362hx2vifmci
The parser builds fully connected derivations incrementally, in a single pass from left to right across the string. ... The basic parser and conditional probability models are presented, and empirical results are provided for its parsing accuracy on both newspaper text and spontaneous telephone conversations. ... A derivation that always replaces the rightmost nonterminal is called rightmost, and a derivation that always replaces the leftmost nonterminal is called leftmost. ...arXiv:cs/0105019v1 fatcat:2hh2zt43drazfnpef2m43e6ica
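The leftmost/rightmost distinction quoted in this entry is easy to see on a toy grammar. The sketch below is illustrative only (the grammar, the `derive` helper, and its `pick` parameter are invented for this example, not taken from the cited paper): it expands a sentential form by always rewriting either the leftmost or the rightmost nonterminal.

```python
# Toy CFG: S -> A B, A -> 'a', B -> 'b'. Keys are nonterminals;
# each maps to a list of right-hand sides (here, one per nonterminal).
GRAMMAR = {"S": [["A", "B"]], "A": [["a"]], "B": [["b"]]}

def derive(start, pick):
    """Expand nonterminals until none remain. `pick` chooses which
    nonterminal position in the sentential form to rewrite next."""
    form = [start]
    steps = [list(form)]
    while any(sym in GRAMMAR for sym in form):
        positions = [i for i, sym in enumerate(form) if sym in GRAMMAR]
        i = pick(positions)
        form = form[:i] + GRAMMAR[form[i]][0] + form[i + 1:]
        steps.append(list(form))
    return steps

leftmost = derive("S", min)    # always rewrite the leftmost nonterminal
rightmost = derive("S", max)   # always rewrite the rightmost nonterminal
# leftmost:  [S] -> [A, B] -> [a, B] -> [a, b]
# rightmost: [S] -> [A, B] -> [A, b] -> [a, b]
```

Both strategies derive the same terminal string "a b"; only the intermediate sentential forms differ, which is why a parser can commit to one canonical derivation order (e.g. rightmost-in-reverse for LR parsing).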
Lecture Notes in Computer Science
Franc Grootjen and Frank Nusselder have put tremendous effort in the successful compilation of the Grammar Workbench (Chapter 7). ... We received kind help from Job Honig, Theo Vosse, John Carroll, and Hans de Vreught in finding a practical grammar for testing our algorithms from Chapter 4 on. ... The second part constitutes the actual calculation of the affixes, and the resolution of ambiguities by means of interaction with the user. ...doi:10.1007/3-540-10284-1_6 fatcat:es5rjudpajglpmkjzzwsqsipzy
Two important recent trends in nlg are (i) probabilistic techniques and (ii) comprehensive approaches that move away from traditional strictly modular and sequential models. ... The generators were evaluated in terms of output quality, development time and computational efficiency against (i) human forecasters, (ii) a traditional handcrafted pipelined nlg system, and (iii) a halogen-style ... Many thanks to John Carroll, Roger Evans, Gerald Gazdar, Daniel Paiva, Ehud Reiter, Kees van Deemter, David Weir and in particular the anonymous nle reviewers, for very helpful comments. ...doi:10.1017/s1351324907004664 fatcat:b64n4u6tcndojkqoc6t2t7xyhi
In addition, we show the effectiveness of our architecture by evaluating on treebanks for Chinese (CTB) and Japanese (KTB) and achieve new state-of-the-art results. ... Through self-training and co-training with the two classifiers, we show that the interplay between them helps improve the accuracy of both, and as a result, effectively parse. ... Acknowledgements The authors would like to thank the anonymous reviewers, Alexandra Birch, Frank Keller, Ankur Parikh, Marcio Fonseca, Ronald Cardenas, Zheng Zhao and Yftah Ziser for their feedback on ...arXiv:2110.02283v2 fatcat:o32ajiq6tffgnfuphhxn6ovqxa
(English summary) 2002a:68059 Dassow, Jürgen (with Mitrana, Victor) On the leftmost derivation in cooperating grammar systems. ... (English summary) 2002d:68025 Hutter, Marcus New error bounds for Solomonoff prediction. ...
The Handbook of Computational Linguistics and Natural Language Processing
This outstanding multi-volume series covers all the major subdisciplines within linguistics today and, when complete, will offer a comprehensive survey of linguistics as a whole. ... ACKNOWLEDGMENT I am greatly indebted to Constantin Orăsan, Gloria Corpas, Elena Lloret, Laura Pack-Hagan, Erin Phillips, and especially Richard Evans for their help, comments, and suggestions. ... Parts of this paper were written while the second author was a research fellow at the Center for the Study of Language and Information, Stanford University. ...doi:10.1002/9781444324044.ch10 fatcat:peq2ppl6gnfklh7gtwzbrt5xym
Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing Volume 1 - EMNLP '09
We also present an approximation to entropy measures that would otherwise be intractable to calculate for a grammar of that size. ... In this paper, we present novel methods for calculating separate lexical and syntactic surprisal measures from a single incremental parser using a lexicalized PCFG. ... Any opinions, findings, conclusions or recommendations expressed in this publication are those of the authors and do not necessarily reflect the views of the NSF. ...doi:10.3115/1699510.1699553 fatcat:iqang7qgpvat5jlzazhnpa5uaq
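For readers unfamiliar with the surprisal measure this entry refers to: the surprisal of a word is the negative log probability of that word given its preceding context. A minimal sketch of the definition (the probabilities below are invented for illustration; they are not drawn from the paper's lexicalized PCFG):

```python
import math

def surprisal(p):
    """Surprisal in bits of an event with conditional probability p."""
    return -math.log2(p)

# A highly predictable continuation carries little surprisal,
# an unlikely one a lot (probabilities here are made up):
low = surprisal(0.5)       # 1.0 bit
high = surprisal(2 ** -6)  # 6.0 bits
```

The paper's contribution is to split this quantity into separate lexical and syntactic components computed from a single incremental parser; the formula above is only the shared definition both components instantiate.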
This suggests that neuroscience and linguistics converge on the view that, to a large extent, language acquisition arises due to our genetic endowment. ... Our research has also shown how statistical dependencies and the ability to draw structural generalizations are basic processes that interact intimately. ... 68/2002 and by the Regione Friuli-Venezia Giulia (L.R. 3/98). ...pmid:16649719 fatcat:5cfahfad6neo5b47tsd23wu4iy
Word Knowledge and Word Usage
Acknowledgments: We gratefully acknowledge the European Science Foundation Research Networking Programmes, and the former ESF Standing Committee for the Humanities for their great foresight and support ... We express our hope that similar programmes will continue to be launched and funded in Europe in the years to come. ... modularity between grammar and pragmatics. ...doi:10.1515/9783110440577-002 fatcat:3gntrjuqefg3fk64dvgw253p2i
This is "Chunk-and-Pass" processing. Similarly, language learning must also occur in the here and now, which implies that language acquisition is learning to process, rather than inducing, a grammar. ... Chunk-and-Pass processing also helps explain a variety of core properties of language, including its multilevel representational structure and duality of patterning. ... The greater memory capacities and ability to use contextual and other pragmatic cues to infer meanings may relax the Now-or-Never bottleneck, nudging grammars toward morphological simplification with ...doi:10.1017/s0140525x1500031x pmid:25869618 fatcat:n5cecbsddvdgbcwid5w2gmr6ay
We demonstrate that, with such formulation, syntactic, semantic, and morpho-syntactic dependencies are all analysable as grounded in their potential for interaction. ... We argue that to reflect participant interactivity in conversational dialogue, the Christiansen & Chater (C&C) perspective needs a formal grammar framework capturing word-by-word incrementality ...doi:10.1017/s0140525x15000849 pmid:27562087 fatcat:5fmm7eb6bzfepo4jjjwqoblt5e
We start at the leftmost node in the graph, and at the left of the sentence. ... ATNs) • Semantic grammars (e.g. LIFER, SOPHIE) • Case frame instantiation (e.g. ELI) • Wait and see (e.g. Marcus) • Word expert (e.g. ...doi:10.1184/r1/6602801.v1 fatcat:nwoix5dg7bfqvlnb3duzw247mu
The grammar is shared by the speech recognizer, which uses a context-free grammar as a language model, and by the dialog manager, which uses these grammars for natural language understanding and contextual interpretation. ...doi:10.5445/ir/1000019778 fatcat:wwvxi4yjfra6npl4hdpfnsj4gm
Showing results 1 — 15 out of 36 results