A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2019; you can also visit the original URL.
The file type is `application/pdf`.
### Some Classes of Regular Languages Identifiable in the Limit from Positive Data [chapter]

2002 *Lecture Notes in Computer Science*
doi:10.1007/3-540-45790-9_6
fatcat:nmlknwqoyfbybhb6rpomqrn7qy

Inference with Multinomial Tests p. 149 *Learning* *Languages* with Help p. 161 Incremental *Learning* of Context Free Grammars p. 174 Estimating Grammar Parameters Using Bounded Memory p. 185 *Stochastic* ... *Languages* p. 283 Software Descriptions The EMILE 4.1 Grammar Induction Toolbox p. 293 Software for Analysing Recurrent Neural Nets That *Learn* to Predict Non-*regular* *Languages* p. 296 A Framework ...
### Page 1778 of Mathematical Reviews Vol. 41, Issue 6 [page]

1971 *Mathematical Reviews*

A fuzzy automaton behaves in a *deterministic* fashion. However, it has many properties similar to those of *stochastic* automata. Its application as a model of *learning* systems is discussed. ... T. 9670 Linear *regular* *languages*. I. Acta Cybernet. 1 (1969), 3-12. ...
### Ten Open Problems in Grammatical Inference [chapter]

2006 *Lecture Notes in Computer Science*

doi:10.1007/11872436_4
fatcat:5snm4lpumbhw5e7ffh5vwxzhum

They cover the areas of polynomial *learning* models, *learning* from ordered alphabets, *learning* *deterministic* Pomdps, *learning* negotiation processes, *learning* from context-free background knowledge. ... Work with Henning Fernau on polynomial *learning* is where the ideas in section 3 come from. Philippe Jaillon gave me the initial ideas for the negotiation problem in section 9. Discussions with ... Testing equivalence of *regular* *deterministic* distributions. *Learning* *stochastic* *languages* is an important topic in grammatical inference. ...
### A bibliographical study of grammatical inference

2005 *Pattern Recognition*

doi:10.1016/j.patcog.2005.01.003
fatcat:62qwskiqcvddjobakbdshwebqq

The field of grammatical inference (also known as grammar induction) is transversal to a number of research areas including machine *learning*, formal *language* theory, syntactic and structural pattern recognition ... Therefore, *learning* a *stochastic* automaton involves a modification of bias from what has been presented before: even if the underlying *language* is *regular*, the distribution may not be. ... The theory: the main focus of research in the field of grammatical inference has been set on *learning* *regular* grammars or *deterministic* finite automata (DFA). ...
### Page 1554 of Mathematical Reviews Vol. 57, Issue 4 [page]

1979 *Mathematical Reviews*

S. 57 + 11976 *Learning* with *stochastic* automata and *stochastic* *languages*. With discussion. Computer oriented *learning* processes (Proc. NATO Advanced Study Inst., Bonas, 1974), pp. 69-107. ... This paper contains a survey of *learning* algorithms for *stochastic* automata and inference procedures for *stochastic* grammars. ...
### Learning Deterministic Regular Expressions for the Inference of Schemas from XML Data

2010 *ACM Transactions on the Web*

doi:10.1145/1841909.1841911
fatcat:hre7agfyuzhudl3xwlaxxd4xg4

Unfortunately, there is no algorithm capable of *learning* the complete class of *deterministic* *regular* expressions from positive examples only, as we will show. ... Inferring an appropriate DTD or XML Schema Definition (XSD) for a given collection of XML documents essentially reduces to *learning* *deterministic* *regular* expressions from sets of positive example words ... Since *deterministic* *regular* expressions like a* define infinite *languages*, and since every non-empty finite *language* can be defined by a *deterministic* expression (as we show in the full version of this ...
### Learning deterministic regular expressions for the inference of schemas from XML data

2008 *Proceeding of the 17th international conference on World Wide Web - WWW '08*

doi:10.1145/1367497.1367609
dblp:conf/www/BexGNV08
fatcat:bqs6npqyi5eaxlzv5n3ktgb6di

Unfortunately, there is no algorithm capable of *learning* the complete class of *deterministic* *regular* expressions from positive examples only, as we will show. ... Inferring an appropriate DTD or XML Schema Definition (XSD) for a given collection of XML documents essentially reduces to *learning* *deterministic* *regular* expressions from sets of positive example words ... Since *deterministic* *regular* expressions like a* define infinite *languages*, and since every non-empty finite *language* can be defined by a *deterministic* expression (as we show in the full version of this ...
### Stochastic

2019 *Proceedings of the 2019 Conference of the North*

doi:10.18653/v1/n19-1411
dblp:conf/naacl/BahuleyanMZV19
fatcat:y5cetx4ct5hdjcr6x4ibeequ4y

In this paper, we propose to use the Wasserstein autoencoder (WAE) for probabilistic sentence generation, where the encoder could be either *stochastic* or *deterministic*. ... the *stochasticity* of the encoder. ... D and S refer to the *deterministic* and *stochastic* encoders, respectively. ↑/↓ The larger/lower, the better. ...
### Mungojerrie: Reinforcement Learning of Linear-Time Objectives [article]

2021 *arXiv* pre-print

arXiv:2106.09161v2
fatcat:k7jvqed2wzebfbthfg2zwhxzo4

Mungojerrie (https://plv.colorado.edu/mungojerrie/) is a tool for testing reward schemes for ω-*regular* objectives on finite models. ... An alternative to this manual programming, akin to programming directly in assembly, is to specify the objective in a formal *language* and have it "compiled" to a reward scheme. ... A natural choice for this *language* is Linear Temporal Logic (LTL) [22, 27], or more generally, ω-*regular* *languages* [26]. ω-*regular* *languages* describe infinite sequences. ...
### Sparseout: Controlling Sparsity in Deep Networks [chapter]

2019 *Lecture Notes in Computer Science*

doi:10.1007/978-3-030-18305-9_24
fatcat:6kc7pdzm2zhxnc5mxutdiogu3y

Sparsity is a potentially important property of neural networks, but is not explicitly controlled by Dropout-based *regularization*. ... Sparseout provides a way to investigate sparsity in state-of-the-art deep *learning* models. Source code for Sparseout could be found at . ... *Stochastic* *regularization* has become the standard practice in training deep *learning* models and has outperformed *deterministic* *regularization* methods on many tasks. ...
### Bayesian Layers: A Module for Neural Network Uncertainty [article]

2019 *arXiv* pre-print

arXiv:1812.03973v3
fatcat:oxsckegvezcfljz25nlz4cfn54

This enables composition via a unified abstraction over *deterministic* and *stochastic* functions and allows for scalability via the underlying system. ... Finally, we show how Bayesian Layers can be used within the Edward2 probabilistic programming *language* for probabilistic programs with *stochastic* processes. ... ., 2017) (Figure 6); or an auto-encoder with *stochastic* encoders and decoders (Figure 7). 3 Signature: To implement *stochastic* output layers, we perform *deterministic* computations given a tensor-dimensional ...
### Learning deterministic regular grammars from stochastic samples in polynomial time

1999 *RAIRO - Theoretical Informatics and Applications*

doi:10.1051/ita:1999102
fatcat:rw2vcb2qtnfo7ma5cns3dffuum

In this paper, the identification of *stochastic* *regular* *languages* is addressed. ... For this purpose, we propose a class of algorithms which allow for the identification of the structure of the minimal *stochastic* automaton generating the *language*. ... Every *stochastic* *deterministic* *regular* grammar G defines a *stochastic* *deterministic* *regular* *language* (SDRL), L_G, through the probabilities p(w|L_G) = p(S ⇒ w). ...
### Learning Probabilistic Residual Finite State Automata [chapter]

2002 *Lecture Notes in Computer Science*

doi:10.1007/3-540-45790-9_7
fatcat:go6ap7enwjhnflnwve4ayalj74

We prove that there are more *languages* generated by PRFA than by Probabilistic *Deterministic* Finite Automata (PDFA). ... We show that this class can be characterized by a simple intrinsic property of the *stochastic* *languages* they generate (the set of residual *languages* is finitely generated) and that it admits canonical ... It consists of all *stochastic* *languages* generated by probabilistic finite automata. Also, the class of *stochastic* *deterministic* *regular* *languages* on Σ is denoted by L_PDFA(Σ). ...
### Probabilistic finite-state machines - part II

2005 *IEEE Transactions on Pattern Analysis and Machine Intelligence*

doi:10.1109/tpami.2005.148
pmid:16013757
fatcat:vaoopt4ypzffzpv53pxx2hodpy

This is a direct consequence of the fact that every *regular* *language* is the support of at least one *stochastic* *regular* *language*, and there are *regular* *languages* which are not k-testable. ... APPENDIX A.1 Proof of Theorem 3. Theorem 3 (*Stochastic* morphism theorem). Let Σ be a finite alphabet and D be a *stochastic* *regular* *language* on Σ*. ...
### Learning Languages with Help [chapter]

2002 *Lecture Notes in Computer Science*

doi:10.1007/3-540-45790-9_13
fatcat:adbjrahojfejxieqmglom6vnay

We propose a general setting to deal with these cases and provide algorithms that can *learn* *deterministic* finite automata in these conditions. ... Grammatical inference consists in *learning* formal grammars for unknown *languages* when given *learning* data. Classically this data is raw: strings that belong to the *language* or that do not. ... An interesting alternative is to consider the hypothesis that not only is the *language* *regular*, but that the distribution also is. In such a case one needs to *learn* a *Stochastic* Finite Automaton. ...
*Showing results 1 — 15 out of 13,779 results*