Diachronic Probabilistic Grammar

Benedikt Szmrecsanyi
2013 English Language and Linguistics  
focus on the gradient effects that language-internal, contextual predictors have on variation patterns • grammar: investigate 3 well-known syntactic alternations in the grammar of English The 100-split  ...  interest in socially contextualized language usage • language users implicitly learn the probabilistic effects of constraints on variation by constantly (re-)assessing input of spoken and written discourses  ... 
doi:10.17960/ell.2013.19.3.002 fatcat:nfdwrlkcmvcu5g6sndx4wvsddq

Grammar Acquisition and Statistical Parsing by Exploiting Local Contextual Information

Thanaruk Theeramunkong, Manabu Okumura
1998 Journal of Natural Language Processing  
based on the algorithm given in section 2. Brackets (rules) which occur more than 40 times in the corpus are considered, and the number of contexts used is determined by the criterion described in  ...  of the probability instead of the original probability, in order to eliminate the effect of the number of rule applications, as done in (Magerman and Marcus). Utilizing top N contexts, we learn the whole grammar  ...  acquisition based on local contextual information and clustering analysis.  ... 
doi:10.5715/jnlp.5.3_107 fatcat:x7vbc3owifdpxax3un5btdwjxe
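The snippet above combines two ideas a short sketch can make concrete: keeping only rules that pass a corpus-frequency threshold, and scoring a parse so that the number of rule applications does not bias the comparison, as in Magerman and Marcus, whose parser scores theories by the geometric mean of rule probabilities rather than their raw product. The sketch below is illustrative only; the function names and toy numbers are ours, not the paper's, and it assumes a parse is represented simply by the probabilities of the rules it applies.

```python
from math import prod

def raw_score(rule_probs):
    # Raw product of rule probabilities: systematically favors parses
    # that apply fewer rules, regardless of how good each rule is.
    return prod(rule_probs)

def geometric_mean_score(rule_probs):
    # Geometric mean of the same probabilities: comparable across parses
    # with different numbers of rule applications.
    return prod(rule_probs) ** (1.0 / len(rule_probs))

deep = [0.9] * 10    # a larger parse built from high-probability rules
shallow = [0.7] * 2  # a smaller parse built from mediocre rules

print(raw_score(shallow) > raw_score(deep))                         # True: size bias
print(geometric_mean_score(deep) > geometric_mean_score(shallow))   # True: bias removed
```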

Page 3631 of Mathematical Reviews Vol. , Issue 2000e [page]

2000 Mathematical Reviews  
Summary: “Probabilistic analogues of regular and context-free grammars are well known in computational linguistics, and currently the subject of intensive research.  ...  Summary: “The paper discusses some classes of contextual grammars—mainly those with ‘maximal use of selectors’—giving some arguments that these grammars can be considered a good model for natural language  ... 

Page 250 of Computational Linguistics Vol. 27, Issue 2 [page]

2001 Computational Linguistics  
each prefix string from the probabilistic grammar, and hence a conditional probability for each word given the previous words and the probabilistic grammar  ...  The following section will provide some background in probabilistic context-free grammars and language modeling for speech recognition.  ... 
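The excerpt above refers to the standard way a probabilistic grammar is turned into a language model: the grammar assigns a probability to every prefix string, and the conditional probability of the next word is a ratio of two prefix probabilities. A minimal statement of that idea, in our own notation rather than the page's:

```latex
% Prefix probability under a PCFG G: total probability mass of all
% complete sentences that begin with w_1 ... w_k.
P_G(w_1 \dots w_k \cdots) \;=\; \sum_{v \in \Sigma^{*}} P_G(w_1 \dots w_k \, v)

% Conditional probability of the next word given the preceding words:
P_G(w_k \mid w_1 \dots w_{k-1}) \;=\;
  \frac{P_G(w_1 \dots w_k \cdots)}{P_G(w_1 \dots w_{k-1} \cdots)}
```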

Arabic Probabilistic Context Free Grammar Induction from a Treebank

Nabil Khoufi, Chafik Aloulou, Lamia Hadrich Belguith
2015 Research in Computing Science  
In this paper, we present our method to automatically induce a syntactic grammar from an Arabic annotated corpus (The Penn Arabic TreeBank), a probabilistic context free grammar in our case.  ...  Finally, we present and discuss the obtained grammar.  ...  based on the induced grammar described in this paper.  ... 
doi:10.13053/rcs-90-1-6 fatcat:gjfrrc52qvgsfjnyjz3bv6ek4e
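The abstract above describes inducing a probabilistic context-free grammar from an annotated corpus. The usual way to do this from a treebank is relative-frequency estimation over its productions; the sketch below assumes bracketed trees already loaded as nested tuples, and the toy trees and helper names are ours, not taken from the Penn Arabic TreeBank or the paper.

```python
from collections import Counter

def extract_productions(tree):
    # Yield (lhs, rhs) productions from a nested-tuple tree such as
    # ('S', ('NP', ('N', 'kitab')), ('VP', ('V', 'qaraa'))); leaves are strings.
    label, *children = tree
    rhs = tuple(c if isinstance(c, str) else c[0] for c in children)
    yield (label, rhs)
    for c in children:
        if not isinstance(c, str):
            yield from extract_productions(c)

def induce_pcfg(treebank):
    # Relative-frequency estimate: P(A -> beta) = count(A -> beta) / count(A).
    rule_counts, lhs_counts = Counter(), Counter()
    for tree in treebank:
        for lhs, rhs in extract_productions(tree):
            rule_counts[(lhs, rhs)] += 1
            lhs_counts[lhs] += 1
    return {rule: c / lhs_counts[rule[0]] for rule, c in rule_counts.items()}

# Toy input, standing in for parsed treebank files (not real PATB data).
treebank = [
    ('S', ('NP', ('N', 'kitab')), ('VP', ('V', 'qaraa'))),
    ('S', ('NP', ('N', 'walad')), ('VP', ('V', 'qaraa'), ('NP', ('N', 'kitab')))),
]
for rule, p in sorted(induce_pcfg(treebank).items()):
    print(rule, round(p, 2))
```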

How Blue Can You Get? Learning Structural Relationships for Microtones via Continuous Stochastic Transduction Grammars

Dekai Wu
2016 International Conference on Computational Creativity  
Linguistic and grammar oriented models for music commonly approximate features like pitch using discrete symbols to represent 'clean' notes on scales.  ...  grammars.  ...  Stochastic transduction grammars generalize stochastic grammars to model two streams instead of one.  ... 
dblp:conf/icccrea/Wu16 fatcat:tfgwqh5izfgzjclucoeuuvoc3u
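The last fragment above is the key generalization: a stochastic transduction grammar derives two output streams at once, so a single derivation assigns a joint probability to a pair of strings. A minimal discrete sketch of that idea follows; the rules, probabilities, and symbols are invented for illustration, and Wu's paper is specifically about extending such grammars to continuous pitch values, which this toy does not attempt.

```python
from math import prod

# Toy stochastic transduction grammar in inversion-transduction style.
# '[]' rules expand two nonterminals in the same order on both streams,
# '<>' rules reverse the order on the second stream, and 'pair' rules
# emit one terminal symbol on each stream.
rules = {
    ('S', '[]', ('A', 'B')):      0.6,
    ('S', '<>', ('A', 'B')):      0.4,
    ('A', 'pair', ('C4', 'do')):  0.5,
    ('A', 'pair', ('D4', 're')):  0.5,
    ('B', 'pair', ('E4', 'mi')):  1.0,
}

def derivation_probability(derivation):
    # Joint probability of the two output streams under one derivation:
    # the product of the probabilities of the rules it uses.
    return prod(rules[r] for r in derivation)

# A derivation that yields stream 0 = (C4, E4) and stream 1 = (do, mi).
d = [('S', '[]', ('A', 'B')),
     ('A', 'pair', ('C4', 'do')),
     ('B', 'pair', ('E4', 'mi'))]
print(derivation_probability(d))  # 0.6 * 0.5 * 1.0 = 0.3
```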

Page 327 of Computational Linguistics Vol. 21, Issue 3 [page]

1995 Computational Linguistics  
With such a formulation, the capability of context-sensitive parsing (in a probabilistic sense) can be achieved with a context-free grammar.  ...  It is interesting to compare our frameworks (Su et al. 1991) with the work by Briscoe and Carroll (1993) on probabilistic LR parsing.  ... 

Page 126 of Computational Linguistics Vol. 20, Issue 1 [page]

1994 Computational Linguistics  
Machine iteration is constrained by augmenting the rules with additional contextual information, restrictions on the number of times some rules can apply, and probabilities.  ...  The descriptive elements (e.g., phones, contextual descriptors) come largely from linguistics, and the procedural components are primarily computational (e.g., decision trees, finite-state grammars) and  ... 

A Theoretical Foundation for Syntactico-Semantic Pattern Recognition

Shrinivasan Patnaikuni, Dr Sachin Gengaje
2021 IEEE Access  
These algorithms were essentially dependent on the syntactic grammars defining the patterns.  ...  The approach consists of an integration mapping between probabilistic context-free grammar (PCFG) and Multi Entity Bayesian network (MEBN), a first-order logic for modeling probabilistic knowledge bases.  ...  Though probabilistic/stochastic grammars use a probability distribution over the rules of the grammar, they are purely based on the syntactic and statistical properties of the strings in the grammar  ... 
doi:10.1109/access.2021.3115445 fatcat:oyifmss2qbfp3dv3ir5zmjfrgy

Recognition of Multi-Object Events Using Attribute Grammars

Seong-wook Joo, Rama Chellappa
2006 2006 International Conference on Image Processing  
Probabilistic parsing and probabilistic conditions on the attributes are used to achieve a robust recognition system.  ...  In contrast to conventional grammars, attribute grammars are capable of describing features that are not easily represented by finite symbols.  ...  We also combine probabilistic parsing and probabilistic conditions on the attributes to design a robust recognition system.  ... 
doi:10.1109/icip.2006.313035 dblp:conf/icip/JooC06 fatcat:nrxwzeel5fce7asihcu6t2e7mi
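The excerpt above names two ingredients: probabilistic parsing of an event grammar and probabilistic (rather than hard) conditions on rule attributes, so that noisy measurements degrade a rule's score gracefully instead of rejecting it outright. The fragment below is only a sketch of the second ingredient; the event, attribute, and numeric values are invented and do not come from the paper's grammar.

```python
from math import exp

def soft_condition(x, mean, var):
    # Unnormalized Gaussian score standing in for a probabilistic attribute
    # condition; a hard condition would instead return 0 or 1.
    return exp(-((x - mean) ** 2) / (2.0 * var))

def score_production(rule_prob, person_to_car_distance):
    # Toy production EVENT -> approach pickup, with the attribute condition
    # "the person is near the car" evaluated probabilistically.
    return rule_prob * soft_condition(person_to_car_distance, mean=0.0, var=4.0)

print(score_production(0.3, person_to_car_distance=1.0))   # near: high score
print(score_production(0.3, person_to_car_distance=10.0))  # far: score close to zero
```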

Reversibility reconsidered: finite-state factors for efficient probabilistic sampling in parsing and generation

Marc Dymetman, Sriram Venkatapathy, Chunyang Xiao
2015 Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing  
We restate the classical logical notion of generation/parsing reversibility in terms of feasible probabilistic sampling, and argue for an implementation based on finite-state factors.  ...  We propose a modular decomposition that reconciles generation accuracy with parsing robustness and allows the introduction of dynamic contextual factors. (Opinion Piece)  ...  Probabilistic reversibility: In the classical non-probabilistic case, a (relative) consensus existed around the fact that a reversible grammar should be, as we indicated above, a formal specification of  ... 
doi:10.18653/v1/d15-1233 dblp:conf/emnlp/DymetmanVX15 fatcat:qu5vtrdf5vdabovog4o5lxnsd4

Page 3875 of Mathematical Reviews Vol. , Issue 98F [page]

1998 Mathematical Reviews  
In order to add a probabilistic aspect to the ordinary theory of graph grammars, the author introduces probabilistic graph grammars.  ...  Probabilistic graph sets and probabilistic hyperedge replacement grammars are thus introduced. The notion of generating function is used to measure the growth of nonterminals.  ... 

A Best-Fit Approach to Productive Omission of Arguments

Eva H. Mok, John Bryant
2006 Proceedings of the annual meeting of the Berkeley Linguistics Society  
Defining Best-Fit: The best-fit score of an analysis given an utterance is a probabilistic metric which combines syntactic, contextual, and semantic factors.  ...  One analysis may have the best semantic fit, and yet a different analysis may have the best contextual fit.  ... 
doi:10.3765/bls.v32i1.3462 fatcat:7mk2pn6qo5a5dgqwfqxjicxpba
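The snippet above defines the best-fit score only as a probabilistic metric combining syntactic, contextual, and semantic factors, and notes that the factors can disagree across analyses. The sketch below shows one generic way such a combination behaves (a sum of log probabilities); the actual combination in Mok and Bryant's model may well differ, and the numbers are invented.

```python
from math import log

def best_fit_score(syntactic_p, contextual_p, semantic_p):
    # Combine the three factor probabilities in log space; the analysis
    # with the highest combined score is selected.
    return log(syntactic_p) + log(contextual_p) + log(semantic_p)

# One analysis has the better semantic fit, the other the better contextual fit;
# the combined metric adjudicates between them.
analysis_a = best_fit_score(syntactic_p=0.8, contextual_p=0.2, semantic_p=0.9)
analysis_b = best_fit_score(syntactic_p=0.8, contextual_p=0.6, semantic_p=0.4)
print('B wins' if analysis_b > analysis_a else 'A wins')  # B wins
```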

Predictability and phonology: past, present and future

Jason Shaw, Shigeto Kawahara
2018 Linguistics Vanguard  
This introduction aims to contextualize the papers in the special issue within a broader theoretical context, focusing on what it means for phonological theory to incorporate gradient predictability, what  ...  Another path towards probabilistic models has been the body of evidence indicating that phonological grammar itself is deeply probabilistic: phonological patterns tend to generalize according to their  ...  One possibility is that probabilistic patterns are of precisely the same type as deterministic (i.e.  ... 
doi:10.1515/lingvan-2018-0042 fatcat:3j2te2f3xfh25cj37oylknt3m4

Page 1637 of Mathematical Reviews Vol. 42, Issue 6 [page]

1971 Mathematical Reviews  
On the other hand, with certain restrictions on the structure of the automaton or on the number of random inputs, it is possible to restrict the class of languages the probabilistic automaton will define  ...  Erol 9106 On languages defined by linear probabilistic automata. Information and Control 16 (1970), 487-501.  ... 
Showing results 1–15 of 7,494 results